
Thread: 10 bits dpx texture

  1. #1
    Junior Member Newbie
    Join Date
    May 2011
    Posts
    18

    10 bits dpx texture

    Hi,

    I need to display a 10-bit texture coming from a 10-bit DPX file.

    1)- I have a FirePro, so I can enable 10-bit OpenGL output (that's what I want to do in the end). For now I have disabled it, hoping to see the 10-bit texture in an 8-bit-per-channel OpenGL view.
    I can see something, but the colors are strange, and it's not a simple channel swap (for example, parts of a grayscale ramp show color!). I use:

    type GL_UNSIGNED_INT_10_10_10_2 (I have also tried GL_UNSIGNED_INT_2_10_10_10_REV), texture format GL_RGBA, and internal format GL_RGB10_A2.
    Do I need a special shader to display the 10-bit texture on an 8-bit-per-channel FBO?
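
    Here is a minimal sketch of the upload I'm attempting (width, height, and pixels stand in for the data decoded from the DPX file):

    Code:
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);

        /* DPX "filled" 10-bit packing puts R in the top 10 bits of each
           32-bit word, then G, then B, then 2 padding bits -- i.e.
           GL_UNSIGNED_INT_10_10_10_2 with format GL_RGBA, per host-order
           word. DPX files are usually big-endian, so on a little-endian
           machine the words must be byte-swapped (in the loader, or by the
           unpack state below), or the channels come out bit-scrambled. */
        glPixelStorei(GL_UNPACK_SWAP_BYTES, GL_TRUE);

        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_INT_10_10_10_2, pixels);

        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);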


    2)- I need to use the alpha channel for compositing through an intermediate FBO. What format should I use to preserve the 10-bit texture and still have a full-range alpha available for compositing? Do I need to use, for example, RGBA12 or RGBA16?

  2. #2
    Intern Contributor
    Join Date
    May 2008
    Location
    USA
    Posts
    99
    Quote Originally Posted by qnext View Post
    Do I need a special shader to display the 10-bit texture on an 8-bit-per-channel FBO?
    If your texture format isn't natively supported on a platform you're targeting (as your weird colors suggest), then yes, you'd need to write a shader. Are you sure it isn't just a colorspace problem? For instance, could the DPX be in cinema XYZ (?) format rather than RGB?

    You should also try to do your uploads in a format the driver won't have to 'swizzle' if you're hoping for high speed.

    I recommend 'HALF' as an interim FBO/texture format. It's 16-bit float RGBA. Good mix of size and fidelity, and supported pretty well across the board these days.
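
    Something along these lines (just a sketch; error handling omitted):

    Code:
        /* 16-bit float RGBA texture as the color attachment of the
           intermediate FBO */
        GLuint fbo, color;
        glGenTextures(1, &color);
        glBindTexture(GL_TEXTURE_2D, color);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
                     GL_RGBA, GL_HALF_FLOAT, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, color, 0);

        /* must report GL_FRAMEBUFFER_COMPLETE before you render into it */
        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
            fprintf(stderr, "FBO incomplete\n");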

    Bruce

  3. #3
    Senior Member OpenGL Guru
    Join Date
    May 2009
    Posts
    4,948
    Um, GL_RGB10_A2 and GL_RGBA16 are required image formats for OpenGL 3.0+. So they exist and are supported on anything released since 2008.
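
    If you want to double-check at runtime, you can also ask the driver directly (needs GL 4.3 or ARB_internalformat_query2; just a sketch):

    Code:
        GLint supported = GL_FALSE;
        glGetInternalformativ(GL_TEXTURE_2D, GL_RGB10_A2,
                              GL_INTERNALFORMAT_SUPPORTED, 1, &supported);
        /* GL_TRUE for any required format, e.g. GL_RGB10_A2 or GL_RGBA16 */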

  4. #4
    Member Regular Contributor malexander's Avatar
    Join Date
    Aug 2009
    Location
    Ontario
    Posts
    327
    I recommend 'HALF' as an interim FBO/texture format. It's 16-bit float RGBA. Good mix of size and fidelity, and supported pretty well across the board these days.
    I agree. We use FLOAT16 for representing 10-bit log Cineon images; it's a good format for playback speed and memory use, and it provides a good base for further color correction if needed.

  5. #5
    Junior Member Newbie
    Join Date
    May 2011
    Posts
    18
    Thanks for all your answers... I have tried using an FBO with FLOAT16, but I still get these strange colors.
    A - Do I need a 30-bit display to get true color, or can I simulate it on a normal screen even if I can't show the full range?
    B - Is GL_RGB10_A2 a 10-bit log format like the one I seem to have in the DPX file?
    C - Do I need a 30-bit framebuffer and a 16-bit FBO to get normal colors from a GL_RGB10_A2 texture, or should I see a normal image (with only a loss of precision) with a standard OpenGL format?

  6. #6
    Intern Contributor
    Join Date
    May 2008
    Location
    USA
    Posts
    99
    No, if your texture is being read properly, then your target format should be irrelevant.

    Run a color bars still through your pipeline and post a picture.

  7. #7
    Member Regular Contributor
    Join Date
    Jun 2013
    Posts
    498
    Quote Originally Posted by qnext View Post
    A - Do I need a 30-bit display to get true color, or can I simulate it on a normal screen even if I can't show the full range?
    OpenGL won't care if the physical framebuffer is only 24-bpp, and will care even less about what the monitor supports.

    Quote Originally Posted by qnext View Post
    B - Is GL_RGB10_A2 a 10-bit log format like the one I seem to have in the DPX file?
    No. It's just 10 bits for each of R,G,B and 2 bits for A, packed into a 32-bit word. It's not even sRGB, let alone logarithmic. OpenGL doesn't directly support logarithmic textures, but you can convert the values in a shader. Note that you'll also have to ensure that the result is correctly converted to sRGB (or whatever the monitor uses).
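
    As a sketch, the usual Cineon/DPX log-to-linear conversion in a fragment shader might look like this (the 685/95 reference points and the 0.6 gamma are the conventional printing-density defaults; the real values should come from the DPX header):

    Code:
        #version 130
        uniform sampler2D tex;  // GL_RGB10_A2 texture holding the raw DPX code values
        in vec2 uv;
        out vec4 fragColor;

        // Conventional Cineon parameters: reference white 685, reference black 95
        // (10-bit code values), display gamma 0.6.
        vec3 cineonToLinear(vec3 c)
        {
            vec3 code   = c * 1023.0;  // unnormalise back to 10-bit code values
            vec3 offset = pow(vec3(10.0), (vec3(95.0) - 685.0) * 0.002 / 0.6);
            vec3 lin    = pow(vec3(10.0), (code - 685.0) * 0.002 / 0.6);
            return (lin - offset) / (1.0 - offset);
        }

        // Linear -> sRGB encode (skip this if you render to an sRGB framebuffer)
        vec3 linearToSrgb(vec3 c)
        {
            return mix(12.92 * c,
                       1.055 * pow(c, vec3(1.0 / 2.4)) - 0.055,
                       step(0.0031308, c));
        }

        void main()
        {
            vec3 lin  = cineonToLinear(texture(tex, uv).rgb);
            fragColor = vec4(linearToSrgb(clamp(lin, 0.0, 1.0)), 1.0);
        }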

    Quote Originally Posted by qnext View Post
    C - Do I need a 30-bit framebuffer and a 16-bit FBO to get normal colors from a GL_RGB10_A2 texture, or should I see a normal image (with only a loss of precision) with a standard OpenGL format?
    Rendering and copying don't care about the number of bits. For unsigned normalised formats, all components are treated as real numbers in the range 0..1. The number of bits just determines the precision.
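
    Concretely (the numbers here are just an illustration):

    Code:
        /* an unsigned-normalised value is the same real number at any bit depth */
        unsigned raw10 = 512;                            /* 10-bit code value     */
        float    v     = raw10 / 1023.0f;                /* sampled as ~0.50049   */
        unsigned raw8  = (unsigned)(v * 255.0f + 0.5f);  /* stored in 8 bits: 128 */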
