Converting 24 bit to 16 bit color information?
10-20-2009, 10:42 PM
I need to convert 24 bit RGB color information to 16 bit color information.
How do I achieve this?
Should I do my own calculations on it, or is there a standard methodology?
Only 16 bit color is supported on certain embedded platforms, not beyond that.
Need your help.
Thanks in advance.
10-21-2009, 03:50 AM
What you want to do is color quantization (http://en.wikipedia.org/wiki/Color_quantization). I advise you to look into this area, but I am not sure how it is related to OpenGL.
10-21-2009, 04:10 AM
Thanks for the information.
But that gives the theoretical explanation.
What about the logic below?
A 24 bit RGB value is laid out as RRRRRRRR-GGGGGGGG-BBBBBBBB, with 8 bits assigned to each primary color.
To convert it to a 16 bit representation, it has to become RRRRR-GGGGGG-BBBBB (5-6-5).
So fetch each color component from the 24 bit value and shift it right to get the 16 bit representation:
RRRRRRRR >> 3 --> RRRRR (5 bits)
GGGGGGGG >> 2 --> GGGGGG (6 bits)
BBBBBBBB >> 3 --> BBBBB (5 bits)
There will be some loss of color information (as theory predicts), but how correct is this logic?
Can someone verify this?
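The shift-and-pack logic described above can be sketched in C as follows. This is a minimal sketch; the function name `rgb888_to_rgb565` is mine, and it assumes the common RGB565 layout with red in the high bits:

```c
#include <stdint.h>

/* Pack 8-bit R, G, B components into a 16-bit RGB565 value by
 * truncating the low bits of each channel (R: drop 3, G: drop 2,
 * B: drop 3) and placing them at bit offsets 11, 5 and 0. */
uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) |
                      ((g >> 2) << 5)  |
                       (b >> 3));
}
```

For example, pure white (255, 255, 255) packs to 0xFFFF and pure red (255, 0, 0) to 0xF800. Note that plain truncation maps full-scale 255 to 31 (or 63), so when converting back to 24 bit you would typically replicate the high bits into the low bits rather than just shifting left, otherwise white becomes slightly grey.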
10-21-2009, 09:28 AM
The old 3dfx cards did hardware dithering.
That's a great way of faking more colours.
10-21-2009, 09:49 AM
The old 3dfx cards could only do something like 12 bits total, so yes, that was needed :)
10-21-2009, 04:46 PM
On the early 3dfx chips all the calculations were done in 24 or 32 bits, then dithered to 16 bits before hitting the framebuffer. This meant that 3dfx's 16 bit colour was superior to the competition's. Quite clever. Considering the hardware of the time, an early Voodoo1 card with 4 MB of RAM, it was a good solution. It's just that later on, 16 bit colour sucked :eek:
The Voodoo3 did this too.
Most strange :)
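The dithering idea mentioned in the posts above can be approximated in software with an ordered (Bayer) dither applied before the 565 truncation. This is only a rough sketch of the general technique, not the actual 3dfx hardware algorithm; the function names and the choice of a 4x4 threshold matrix are mine:

```c
#include <stdint.h>

/* Classic 4x4 Bayer threshold matrix, values 0..15. */
static const uint8_t bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Add a position-dependent bias smaller than one quantization step
 * before truncating, so the dropped low bits become a fine spatial
 * pattern the eye averages out, instead of visible banding.
 * bits_kept is 5 for R/B and 6 for G in RGB565. */
static uint8_t dither_channel(uint8_t v, int x, int y, int bits_kept)
{
    int step = 1 << (8 - bits_kept);               /* size of one quantization step */
    int bias = (bayer4[y & 3][x & 3] * step) / 16; /* ranges over 0 .. step-1 */
    int biased = v + bias;
    if (biased > 255)
        biased = 255;                              /* clamp to avoid overflow */
    return (uint8_t)(biased >> (8 - bits_kept));
}

/* Dithered RGB888 -> RGB565 conversion; x, y are the pixel coordinates. */
uint16_t rgb888_to_rgb565_dithered(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    return (uint16_t)((dither_channel(r, x, y, 5) << 11) |
                      (dither_channel(g, x, y, 6) << 5)  |
                       dither_channel(b, x, y, 5));
}
```

On a smooth gradient this turns each band boundary into a mix of the two neighbouring 16 bit levels, which is exactly the effect being described: faking more colours than the framebuffer can store.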
10-22-2009, 04:36 AM
Are you sure this system was present on the first-generation 3dfx Voodoo?
My memory of the 3dfx Voodoo1 I had is that subtle texture gradients were very ugly and banded, but I may not be accurate. 16 bit textures would look crap even with 24 bit computations anyway...
Nowadays, 24 bits is very easy on desktop platforms, not costing much in terms of framebuffer space relative to total VRAM.
And if your embedded device does 16 bits, it is for both memory and speed, so it does not make sense to do computations in 24 bits.
Around the time of Quake3, I saw a Mac doing 16 bit temporal dithering, which was quite easy on the eyes. It just breaks reproducibility.