Back to the roots: I am using 16-color indexed textures, and I was puzzled by some fillrate issues. In fact, my GeForce boards don't seem to show any significant difference between handling 4-bit textures and 16-bit textures in a scene! So what's going on?
When I specify GL_COLOR_INDEX4_EXT, does OpenGL really use 4-bit data internally, or does something bad happen?
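For reference, here is roughly how I upload the texture. This is a minimal sketch assuming EXT_paletted_texture is exposed and glColorTableEXT has been resolved through the usual extension-loading path (wglGetProcAddress / glXGetProcAddress); the buffer names and the wrapper function are just placeholders:

    #include <GL/gl.h>
    #include <GL/glext.h>

    /* assumed to be resolved at startup via the platform's GetProcAddress */
    extern PFNGLCOLORTABLEEXTPROC glColorTableEXT;

    void upload_indexed_texture(const GLubyte *palette, /* 16 RGBA entries   */
                                const GLubyte *indices, /* one byte per texel,
                                                           values 0..15       */
                                GLsizei width, GLsizei height)
    {
        /* 16-entry palette for the 4-bit indexed texture */
        glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 16,
                        GL_RGBA, GL_UNSIGNED_BYTE, palette);

        /* request a 4-bit internal format; the driver is free to promote
           this to something wider, which would explain seeing no fillrate
           difference versus 16-bit textures */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX4_EXT,
                     width, height, 0,
                     GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
    }

I did try querying what the driver actually allocated with glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt), but as far as I can tell an implementation may just echo back the requested format rather than the real storage.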
Could any NVIDIA coder around here help?
Please don't tell me this extension was just patched in to get more extension defines into the commercial package! :)