Converting 24-bit to 16-bit color information?

Hi,

I need to convert 24-bit color information (RGB) to 16-bit color information.
How do I achieve this?
Should I do my own calculations on it, or
is there a standard methodology for this?

Only 16-bit color representation is supported on certain embedded platforms, nothing beyond that.

Need your help.

Thanks in advance
mailmessb

What you want to do is color quantization. I advise you to look into that area, but I am not sure how it is related to OpenGL.

Thanks for the information.
But that only gives a theoretical explanation…

What about the logic below?

A 24-bit RGB representation is RRRRRRRR-GGGGGGGG-BBBBBBBB, with 8 bits assigned to each primary color.
To convert it to a 16-bit representation, it has to become RRRRR-GGGGGG-BBBBB (5-6-5).

So fetch the color components from the 24-bit representation and shift them accordingly to get the 16-bit representation:

RRRRRRRR >> 3 -> RRRRR (5 bits)
GGGGGGGG >> 2 -> GGGGGG (6 bits)
BBBBBBBB >> 3 -> BBBBB (5 bits)

There will be some loss of color information (as the theory says), but I am not sure how correct this logic is.
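
In C, that truncation approach would look roughly like this (just a sketch; the function name rgb888_to_rgb565 is my own):

```c
#include <stdint.h>

/* Pack one 24-bit RGB888 pixel into 16-bit RGB565 by simple truncation:
 * drop the low 3 bits of R and B and the low 2 bits of G. */
static uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((uint16_t)(r >> 3) << 11) |
                      ((uint16_t)(g >> 2) << 5)  |
                       (uint16_t)(b >> 3));
}
```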

Verify this page

Comments plz

Regards
mailmessb

The old 3dfx cards did hardware dithering;
that's a great way of faking more colours.

The old 3dfx cards could only do something like 12 bits total, so yes, that was needed :slight_smile:

On the early 3dfx chips all the calculations were done in 24- or 32-bit, then dithered to 16-bit before hitting the framebuffer. This meant that 3dfx's 16-bit colour was superior to the competition's. Quite clever. Considering the hardware of the time, an early Voodoo1 card with 4 MB of RAM, it was a good solution. It was only later on that 16-bit colour sucked :eek:
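
For anyone curious, a software version of the same idea is ordered (Bayer) dithering applied before the 5-6-5 pack. A rough sketch, with names of my own choosing:

```c
#include <stdint.h>

/* 4x4 Bayer threshold matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Add a position-dependent bias before truncation so that, averaged over a
 * small neighbourhood, the output approximates the original 8-bit value. */
static uint8_t dither_channel(uint8_t v, int x, int y, int lost_bits)
{
    int bias = (bayer4[y & 3][x & 3] << lost_bits) >> 4; /* spread over the dropped range */
    int out  = v + bias;
    return (uint8_t)(out > 255 ? 255 : out);
}

static uint16_t rgb888_to_rgb565_dithered(uint8_t r, uint8_t g, uint8_t b,
                                          int x, int y)
{
    r = dither_channel(r, x, y, 3); /* R loses 3 bits */
    g = dither_channel(g, x, y, 2); /* G loses 2 bits */
    b = dither_channel(b, x, y, 3); /* B loses 3 bits */
    return (uint16_t)(((uint16_t)(r >> 3) << 11) |
                      ((uint16_t)(g >> 2) << 5)  |
                       (uint16_t)(b >> 3));
}
```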

The Voodoo3 did this:
http://www.beyond3d.com/images/articles/3DFX22bit/pipeline.gif

Most strange :slight_smile:

Are you sure this system was present on the first-generation 3DFX Voodoo?
My memory of the 3DFX1 I had is that subtle texture gradients were very ugly and banded, but I may not be accurate. 16-bit textures would look like crap even with 24-bit computations anyway…

Nowadays, 24-bit is very easy on desktop platforms, not costing much framebuffer space relative to the available VRAM.
And if your embedded device does 16-bit, it is for both memory and speed, so it does not make sense to do the computations in 24-bit.
Around the time of Quake3, I saw a Mac doing 16-bit temporal dithering, and that was quite nice on the eyes. It just breaks reproducibility.