Hi Everyone,
I'm new to fragment shaders myself. I'm trying to write a shader that applies a large LUT to an image. In fact, I want to apply two LUTs: I apply the first LUT, do some operations, then apply the second.
Problem 1:
What is the best way to apply a 65,536-entry (16-bit) 1D LUT (stored in a texture) to an image? The complication is that the maximum texture width is often smaller than 65,536 (e.g., a texture 8,192 texels wide must be 8 texels high to hold a 65,536-entry 1D lookup table). I have worked out a way to use a cascade of "if" statements to offset the pixel value into the texture and perform the lookup, but that doesn't seem very efficient. Is there a better way?
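One branch-free alternative (a sketch, not necessarily the best way) is to derive the row and column from the 16-bit value with plain integer division and modulo, which a GLSL shader can do with floor() and a multiply-subtract. The helper name and constants below are hypothetical, but the arithmetic is the same the shader would perform:

```c
#include <assert.h>

#define LUT_WIDTH 8192  /* texels per row of the LUT texture */

/* Map a 16-bit LUT index to (x, y) texel coordinates in a
 * LUT_WIDTH-wide texture, replacing the cascading "if" chain.
 * GLSL equivalent (rectangle-texture, unnormalized coords):
 *   float y = floor(v / 8192.0);
 *   float x = v - y * 8192.0;                                 */
static void lut_coords(unsigned v, unsigned *x, unsigned *y)
{
    *y = v / LUT_WIDTH;  /* which row of the texture    */
    *x = v % LUT_WIDTH;  /* offset within that row      */
}
```

With GL_TEXTURE_RECTANGLE the coordinates are unnormalized, so these integer results (plus 0.5 to hit texel centers) can be used directly.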
Problem 2:
In my shader, I need to perform two different LUT operations at different points in my color-processing chain. To be efficient, I created a single GL_TEXTURE_RECTANGLE_EXT texture, 16 x 8192 in size. Both LUTs are 8 x 8192, so I use glTexSubImage2D to place them side by side, and then use the method described above to do each lookup.
On an ATI graphics card, both lookups work perfectly. On an nVidia card, the first lookup, using coords 0,0 to 8192,8, works perfectly, but the second lookup fails, returning garbage. Is there some limitation on nVidia that I should be aware of? It almost seems like the glTexSubImage2D call is failing. The weird part is that if I comment out the first lookup, the second one works.
I am stumped. (I wish there were a GLSL debugger.)
bob.