glColorPointer and color format

Hello,
I have a problem when using glColorPointer to specify per-vertex colors. I’m using WinXP, nVidia drivers 23.11 and VC++ 6.0. My application runs in accelerated window mode with a 32 bit color buffer, 24 bit depth buffer and 8 bit stencil buffer.
All my vertices have a DWORD that represents the per-vertex primary color in ARGB format (the MSB is A, the next byte is R, and so on).
Now, when I use glColorPointer to specify colors, it reads the DWORD as ABGR instead of ARGB. How come? And is there a way to change this without changing the way the colors are stored in the DWORD (I can't do that, for various reasons)?
Here’s how I call glColorPointer:
glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(VertexType), pBuffer);
pBuffer is a pointer to an array of vertices with the components interleaved. Before the call the pointer pBuffer gets adjusted so that it points to the color component of the first vertex.
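
For reference, here's roughly what the setup looks like (the VertexType layout below is simplified for illustration; the real structure has more members):

#include <windows.h>   // DWORD
#include <GL/gl.h>

// Simplified interleaved vertex layout.
struct VertexType
{
    float x, y, z;   // position
    DWORD color;     // packed 0xAARRGGBB
};

void SetColorPointer(const VertexType* vertices)
{
    // Point at the color member of the first vertex; the stride then
    // skips over the rest of each vertex.
    const GLubyte* pBuffer =
        reinterpret_cast<const GLubyte*>(&vertices[0].color);
    glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(VertexType), pBuffer);
}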

Thanks for any help.
Regards,
Asgard

I believe this can be done with a vertex program, like this:

const GLubyte vertex_program_ABGR_to_ARGB[] =
    "!!VP1.1\n"
    "OPTION NV_position_invariant;\n"
    // Swap R and B as OpenGL reads them (.x is red, .z is blue).
    "MOV o[COL0], v[COL0].zyxw;\n"
    "END";

The color gets read in as ABGR, and the program simply swaps the R and B components to give you ARGB again (the swizzle refers to the components as OpenGL reads them, so .x is red and .z is blue).
Maybe there is another way, but I can't think of one right now.
Hope this helps.
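
In case it helps, a rough sketch of loading and enabling such a program (this assumes you already fetched the NV_vertex_program entry points with wglGetProcAddress; the typedefs are from glext.h):

#include <cstring>
#include <windows.h>
#include <GL/gl.h>
#include "glext.h"   // NV_vertex_program tokens and function pointer typedefs

// Fetched elsewhere via wglGetProcAddress("glGenProgramsNV") etc.
extern PFNGLGENPROGRAMSNVPROC glGenProgramsNV;
extern PFNGLLOADPROGRAMNVPROC glLoadProgramNV;
extern PFNGLBINDPROGRAMNVPROC glBindProgramNV;

void InstallColorSwapProgram()
{
    GLuint progId;
    glGenProgramsNV(1, &progId);
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, progId,
                    (GLsizei)strlen((const char*)vertex_program_ABGR_to_ARGB),
                    vertex_program_ABGR_to_ARGB);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, progId);
    glEnable(GL_VERTEX_PROGRAM_NV);
}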

Hello,

I was thinking about using vertex shaders, and that would definitely be an option when a user of my engine decides to use programmable shaders. However, I also want the engine to offer the fixed-function pipeline, and in that case I can't use a vertex program.
I still don't understand why the color format is ABGR rather than ARGB (which, admittedly coming from DirectX coding, I always thought of as the standard) or RGBA.
Maybe I'm using the wrong pixel format. I'll have to try under Linux and see what format glColorPointer uses there.

Anyway, thanks for your reply. Maybe somebody else has some ideas?

Regards,
Asgard

OpenGL doesn't seem to accept packed 'pixel' formats (e.g. GL_UNSIGNED_INT_8_8_8_8_REV) as the type in the glColorPointer call. You could experiment with that, but since it isn't mentioned in the spec I won't recommend it, even if it turns out to work for you. So you're out of luck there.

It still strikes me as odd to expect the standard color order to be anything but RGBA taken as a sequence of bytes, and that is in fact what glColorPointer(blabla,GL_UNSIGNED_BYTE,bla,bla) reads. The ABGR appearance comes from the x86 being little-endian: a DWORD holding 0xAARRGGBB sits in memory as the byte sequence BB, GG, RR, AA, so OpenGL picks up blue where it expects red and vice versa.
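
A quick throwaway test (not from your code, just to make the layout visible):

#include <cstdio>

int main()
{
    unsigned long argb = 0xAABBCCDDUL;  // A=AA, R=BB, G=CC, B=DD
    const unsigned char* p = (const unsigned char*)&argb;

    // On little-endian x86 this prints "DD CC BB AA": the bytes
    // glColorPointer walks over come out as B, G, R, A.
    printf("%02X %02X %02X %02X\n", p[0], p[1], p[2], p[3]);
    return 0;
}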

I think you should definitely explore your options for swapping the colors around (exchanging the R and B bytes) before putting them in the vertex array.
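
If you do go that route, here's a minimal sketch of the swap (the function name is just an illustrative assumption):

#include <windows.h>   // DWORD

// Exchange the R and B bytes of a 0xAARRGGBB color so that, stored
// little-endian, the bytes land in memory as R, G, B, A: the order
// glColorPointer(4, GL_UNSIGNED_BYTE, ...) expects.
inline DWORD ArgbToGlRgba(DWORD argb)
{
    return (argb & 0xFF00FF00UL)            // A and G stay put
         | ((argb & 0x00FF0000UL) >> 16)    // R moves to the low byte
         | ((argb & 0x000000FFUL) << 16);   // B moves up
}

You'd run this once over the color member of every vertex before handing the array to OpenGL (and keep a converted copy if the rest of the engine still needs ARGB).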