3D and 1D color lookup implementation

Hi,
Regarding the following…
http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter24.html

How would one write the Cg code for the 3D LUT as a GLSL fragment and vertex program?

Thanks,
Adrian.

It’s just a texture fetch. Pass the color that you want to process as a texture coordinate into a volumetric texture and you’re done.

Thanks Humus,
I'm afraid, as my 'newbie' status declares, I'm going to need a more
pedestrian, "dumbed down" reply.
Question: how does one translate the "half" variables in the Cg code?

Thanks for your patience.

adrian

Half is just a lower-precision float.
Something like this should work.


varying vec2 texCoord;

uniform sampler2D Image;
uniform sampler3D LookupTable;

void main()
{
	// Sample image
	vec3 color = texture2D(Image, texCoord).rgb;

	// Adjust color with lookup table
	vec4 adjusted_color = texture3D(LookupTable, color);

	gl_FragColor = adjusted_color;
}

Thanks Humus,

Using the vertex shader below, I get a blank output in Shader Designer
when I input my image in texture1 and the CLUT in texture2. The code reads "OK".
I seem to recall someone saying that Shader Designer doesn't support 3D textures. If this is so, how can I visualise this fragment shader over an image?
I have tried OS X Shader Designer and the Win32 ATI version as well.

Also, in your interpretation of the Nvidia reference you dismissed the 'LUTSIZE'; why is this? Does it not matter that the image containing the LUT may be of a different physical proportion,
i.e. an 8x8x8, 16x16x16 or 17x17x17 RGB color lookup array?

Regards
Adrian

////////////////
//Vertex Shader

varying vec2 texCoord;

void main()
{
	texCoord = gl_MultiTexCoord0.xy;
	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

The size does not matter, since textures are looked up with normalized texture coordinates in the 0…1 range regardless of size.

Thanks again Humus,

Should a tool like 'Shader Builder' allow me to visualise this 3D lookup over a still image?
I was assuming I could implement the image and lookup texture on tex0 and tex1 using the Frag_P and Vert_P and see the result mapped on a plane.

Do I have a fundamental misunderstanding of how a tool like OS X Shader Builder or Typhoon Labs' 'Shader Designer' is expected to work in this way? Is there a more appropriate visualisation tool I should look at?

Basically, at the moment I get no output when using the fragment and vertex code.

Thanks,

Adrian

Update:
I just downloaded the new version of the OS X Dev Tools (3.1), and 3D textures are supported.
OK.

Will get back later.

adrian

Well, it's kind of working…
Humus, could you also write a vertex program to
go with your fragment shader… Pleeeese?

Thanks
Adrian


varying vec2 texCoord;
void main() {
 texCoord = gl_Vertex.xy;
 gl_Position = vec4(gl_Vertex.xy * 2.0 - 1.0, 0.0, 1.0);
} 

and render a quad like this:


glBegin(GL_QUADS);
glVertex2f(0.0, 0.0);
glVertex2f(0.0, 1.0);
glVertex2f(1.0, 1.0);
glVertex2f(1.0, 0.0);
glEnd();

Hi there oc2k1,
That's an odd name…

Thanks,

Where do I place the 'render a quad' code?
I'm in OS X Shader Builder 3.1.

Thanks for everybody's patience.

Adrian


This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.