Hi!
I’m implementing an HDR render path using framebuffer objects and GLSL. I want to add effects like tone mapping and glow, so I need to create a downsampled image of my scene for blurring etc.
I downsample the scene to 1/4 of its size with an extra shader that reads and averages a 4x4 box of texels.
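For reference, that 16-lookup pass looks roughly like this (a sketch only — the sampler name `scene_tex` and the `texel_size` uniform are my placeholders, not necessarily the real names):

```glsl
uniform sampler2D scene_tex;  // HDR source texture, GL_NEAREST filtering
uniform vec2 texel_size;      // 1.0 / source texture resolution

void main()
{
    // Sum a 4x4 box of texels and average.
    vec4 sum = vec4(0.0);
    for (int y = 0; y < 4; ++y)
        for (int x = 0; x < 4; ++x)
            sum += texture2D(scene_tex,
                             gl_TexCoord[0].xy + vec2(float(x), float(y)) * texel_size);
    gl_FragColor = sum / 16.0;
}
```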
Since modern cards support bilinear filtering for HDR textures (I use the RGBA16F format from the ARB extension), I want to write a second shader that does 4 texture lookups instead of 16.
I want to place the 4 sampling points at texel corners, so that each bilinear lookup averages a 2x2 block of texels:
(the blue pixel is the “current” one, red marks the sampling points)
That gives me the following code in my vertex shader:
gl_TexCoord[0].xy = gl_MultiTexCoord0.xy + vec2(0.5, 0.5) * texel_size;
gl_TexCoord[1].xy = gl_MultiTexCoord0.xy + vec2(0.5, 2.5) * texel_size;
gl_TexCoord[2].xy = gl_MultiTexCoord0.xy + vec2(2.5, 2.5) * texel_size;
gl_TexCoord[3].xy = gl_MultiTexCoord0.xy + vec2(2.5, 0.5) * texel_size;
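The matching fragment shader just averages the four bilinear fetches (again a sketch — `scene_tex` is an assumed sampler name):

```glsl
uniform sampler2D scene_tex;  // HDR source texture, GL_LINEAR filtering

void main()
{
    // Each bilinear fetch averages 2x2 texels, so 4 fetches cover the 4x4 box.
    gl_FragColor = 0.25 * (texture2D(scene_tex, gl_TexCoord[0].xy)
                         + texture2D(scene_tex, gl_TexCoord[1].xy)
                         + texture2D(scene_tex, gl_TexCoord[2].xy)
                         + texture2D(scene_tex, gl_TexCoord[3].xy));
}
```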
However, this does not yield the same result as the 16-lookup shader, although I set the texture filtering to GL_LINEAR/GL_LINEAR for the 4-lookup shader and GL_NEAREST/GL_NEAREST for the 16-lookup shader.
After some exhausting trial and error I’ve found code that works:
gl_TexCoord[0].xy = gl_MultiTexCoord0.xy + vec2(0.0, 0.0) * texel_size;
gl_TexCoord[1].xy = gl_MultiTexCoord0.xy + vec2(0.0, 2.0) * texel_size;
gl_TexCoord[2].xy = gl_MultiTexCoord0.xy + vec2(2.0, 2.0) * texel_size;
gl_TexCoord[3].xy = gl_MultiTexCoord0.xy + vec2(2.0, 0.0) * texel_size;
This makes no sense to me. My understanding of texturing is that OpenGL samples from the center of the texel, whereas DirectX samples from the lower-left corner. So these offsets should work with DirectX but not with OpenGL.
I read the spec and can’t find my mistake. I guess it’s something simple and I just don’t see it.
Please help!
This image shows the delta between the 16-lookup and the 4-lookup shader.