UNSIGNED_INT_10_10_10_2 packing

Can someone offer any experiences with this packing format?
I have such packed texture data in host memory and I am unable to use it as a GL_RGBA texture. Do I need glTexSubImage2DEXT to use this format? (GL_EXT_subtexture is not supported on my platform…)
This is what I’m doing so far, but I still get a 4 x 8-bit representation of the data:

/* Allocate an 800x600 RGBA texture with no initial data... */
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA, 800, 600, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0);
/* ...then upload the packed 10_10_10_2 data into it. */
glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0, 800, 600, GL_BGRA, GL_UNSIGNED_INT_10_10_10_2, (GLuint *)data);

I am unsure about the <format> specified with the glTexSubImage2D() call.

This probably won’t work for two reasons:

  • You specified a generic GL_RGBA internal format in the glTexImage2D call, which the driver will typically resolve to GL_RGBA8.
  • Hardware support for the RGB10_A2 texture format is sparse on current cards (I don’t know of any that expose it). See the sketch below for how to request the sized format explicitly and check what you actually got.
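A rough, untested sketch of that check (reusing the 800x600 rectangle target and the data pointer from your code): ask for the sized GL_RGB10_A2 format up front, then query how many bits per channel the driver actually allocated.

GLint redBits, greenBits, blueBits, alphaBits;

/* Request a sized 10/10/10/2 internal format instead of generic GL_RGBA. */
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB10_A2, 800, 600, 0,
             GL_BGRA, GL_UNSIGNED_INT_10_10_10_2, data);

/* If the driver substituted RGBA8, these queries will report 8/8/8/8 instead of 10/10/10/2. */
glGetTexLevelParameteriv(GL_TEXTURE_RECTANGLE_ARB, 0, GL_TEXTURE_RED_SIZE,   &redBits);
glGetTexLevelParameteriv(GL_TEXTURE_RECTANGLE_ARB, 0, GL_TEXTURE_GREEN_SIZE, &greenBits);
glGetTexLevelParameteriv(GL_TEXTURE_RECTANGLE_ARB, 0, GL_TEXTURE_BLUE_SIZE,  &blueBits);
glGetTexLevelParameteriv(GL_TEXTURE_RECTANGLE_ARB, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits);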

Check this document if you’re on NVIDIA hardware:
http://download.nvidia.com/developer/OpenGL_Texture_Formats/nv_ogl_texture_formats.pdf

You might use FLOAT_RGBA16 (NV_float_buffer) or RGBA_FLOAT16 (ATI_texture_float) format instead.
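That of course means unpacking the packed words into floats on the CPU first. A minimal sketch (the helper unpack_10_10_10_2 is hypothetical, and it assumes your data matches the GL_RGBA + GL_UNSIGNED_INT_10_10_10_2 layout with red in the top 10 bits; with GL_BGRA the red and blue fields swap):

/* Unpack one UNSIGNED_INT_10_10_10_2 pixel into four normalized floats. */
void unpack_10_10_10_2(GLuint p, GLfloat out[4])
{
    out[0] = (GLfloat)((p >> 22) & 0x3FF) / 1023.0f;  /* R: bits 31..22 */
    out[1] = (GLfloat)((p >> 12) & 0x3FF) / 1023.0f;  /* G: bits 21..12 */
    out[2] = (GLfloat)((p >>  2) & 0x3FF) / 1023.0f;  /* B: bits 11..2  */
    out[3] = (GLfloat)( p        & 0x3  ) /    3.0f;  /* A: bits 1..0   */
}

You would then upload the resulting float buffer with format GL_RGBA and type GL_FLOAT.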

I think the only consumer video card that supported that format was the Matrox Parhelia, but I’m not sure.

Ah, I just stumbled over GL_UNSIGNED_INT_10_10_10_2 and was hoping there was a way…
I guess I would need RGB10_A2 as the internal texture format, and that gets substituted with RGBA8 on NVIDIA hardware (thanks for the PDF, Relic).
Is it just me, or is the texture storage, packing and conversion pipeline less documented than the rest of OpenGL?
Every time I start using “non-standard” texture formats I get confused about <internal format>. Maybe that’s because there have been a lot of extensions in that area…

The <format> and <type> parameters describe the texture data that you are passing into the GL. The <internalFormat> parameter describes how you want the GL to store and use that data. For the most part, OpenGL is free to ignore <internalFormat> (the same way that C compilers are free to ignore “register”).

So, if your texture data is stored as RGB triples of floating-point values, but you want the GL to store the texture internally as 10-bit integer RGB data, you would do the following:

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB10, width, height, 0, GL_RGB, GL_FLOAT, data );

Now, if the card can do GL_RGB10, it (probably) will. If it can’t, it will substitute some format that it can do (maybe GL_RGB10_A2 or GL_RGB8 or …).

If your source data is actually stored as packed 10-bit RGB, 2-bit alpha values, you would do:

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_INT_10_10_10_2, data );

You may need to use GL_UNSIGNED_INT_2_10_10_10_REV instead, depending on how exactly your data is stored. Notice that the internal format specified here is a generic format (GL_RGBA). The driver can use whatever format it wants, but most implementations try to avoid having to change the format of the source data.
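For reference, this is how the two types lay out one GL_RGBA pixel in a 32-bit word (hypothetical pack helpers, just to show the bit order; with GL_BGRA the red and blue fields swap places):

/* GL_UNSIGNED_INT_10_10_10_2 with format GL_RGBA:
   R in bits 31..22, G in 21..12, B in 11..2, A in 1..0. */
GLuint pack_10_10_10_2(GLuint r, GLuint g, GLuint b, GLuint a)
{
    return (r << 22) | (g << 12) | (b << 2) | a;
}

/* GL_UNSIGNED_INT_2_10_10_10_REV with format GL_RGBA:
   A in bits 31..30, B in 29..20, G in 19..10, R in 9..0. */
GLuint pack_2_10_10_10_rev(GLuint r, GLuint g, GLuint b, GLuint a)
{
    return (a << 30) | (b << 20) | (g << 10) | r;
}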

ATI cards support 10_10_10_2 as a framebuffer format, but I don’t think they can handle it as a true internal texture format.

We support RGB10_A2 in hardware.