View Full Version : GL_COLOR_MATRIX — on what HW is it supported, how to emulate

04-28-2002, 05:07 AM
I need to do a very fast color transformation
of the form R' = f(R,G,B), G' = f(R,G,B), B' = f(R,G,B). The GL_COLOR_MATRIX path should be optimal. Unfortunately I do not know on what HW
it is accelerated; it looks like it is not in HW on nVidia.
Does anybody know some HW with this function? Or does anybody have an idea for a
fast enough replacement?

04-28-2002, 06:38 AM
Well, if your matrix is constant and a simple 3x3 matrix, you can simply set the constant colors in the different general combiners (NV_register_combiners) to the rows of the matrix and dot3 them. On a GF3+ that's no problem, because you have enough constants; on a GF1+ you need to use, for example, two constants plus the secondary color, or something like that.

It is _possibly_ possible in the texture shaders as well, but I'm not sure at all right now (brain is just too bored to work today ;))

I don't think it's hardware-accelerated in any consumer hardware.

It should be no problem to implement on an ATI Radeon 8500 as well.