View Full Version : glColor with unsigned int

07-17-2006, 08:32 PM
Is there any way, or an extension, to specify a per-vertex color as a single unsigned int packed as RGBA (0xRRGGBBAA), à la Direct3D?

Thanx much.

07-17-2006, 09:23 PM
No. But you can store your color as 0xAABBGGRR; on a little-endian machine the bytes then sit in memory in R, G, B, A order, which is what OpenGL calls RGBA.

Then do
glColor4ubv((const GLubyte *)&mycolor);

You can always use a
GLubyte mycolor[4];
which occupies the same four bytes as a
GLuint mycolor;
so you can alias one as the other.

Yes, I am assuming GLubyte is 8 bits and GLuint is 32 bits.

Brian Paul
07-18-2006, 03:57 PM
And be aware of big/little-endian byte ordering.

If you're going to pass a GLuint to glColor4ubv() you'll have to check the system's byte ordering so that you pack the color channels into the GLuint in the right order.