Hi all,
Got a strange bug and can’t figure it out. I attach a 4x4 texture to an FBO, draw a quad the size of the texture, select the FBO’s color attachment with glReadBuffer, and read it back with glReadPixels. I’m using values outside the [0, 1] range, so I’ve unclamped my vertex colors, fragment colors, and read color using:
glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);
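For reference, the whole path is roughly the following (EXT_framebuffer_object plus ARB_color_buffer_float; error checking, shader setup, and the draw call itself are omitted — this is a sketch of what my code does, not a verbatim copy):

```c
GLuint tex, fbo;

/* 4x4 half-float color texture to render into. */
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, 4, 4, 0,
             GL_RGB, GL_FLOAT, NULL);

/* Attach it as the FBO's color attachment. */
glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, tex, 0);

/* Disable all clamping so values outside [0, 1] survive. */
glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);

/* ... draw the texture-sized quad with the corner colors below ... */

/* Read the color attachment back as floats. */
GLfloat pixels[4 * 4 * 3];
glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
glReadPixels(0, 0, 4, 4, GL_RGB, GL_FLOAT, pixels);
```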
In debug builds everything works great, but in release it looks like the values are still being clamped.
In debug I use the color values:
9.3, 25.1, 9.3
9.3, 25.1, -9.3
-9.3, 25.1, -9.3
-9.3, 25.1, 9.3
at the four corners of a 4x4 texture. I read back from the color attachment:
6.97266, 25.0938, 6.97266
6.97266, 25.0938, 2.32422
6.97266, 25.0938, -2.32422
6.97266, 25.0938, -6.97266
2.32422, 25.0938, 6.97266
2.32422, 25.0938, 2.32422
2.32422, 25.0938, -2.32422
2.32422, 25.0938, -6.97266
-2.32422, 25.0938, 6.97266
-2.32422, 25.0938, 2.32422
-2.32422, 25.0938, -2.32422
-2.32422, 25.0938, -6.97266
-6.97266, 25.0938, 6.97266
-6.97266, 25.0938, 2.32422
-6.97266, 25.0938, -2.32422
-6.97266, 25.0938, -6.97266
So it has obviously interpolated them correctly. In release I do exactly the same thing, with input:
9.3, 25.1, 9.3
9.3, 25.1, -9.3
-9.3, 25.1, -9.3
-9.3, 25.1, 9.3
and get back:
0, 0, 0
1, 1, 1
1, 1, 0
1, 1, 0
1, 1, 1
1, 1, 1
1, 1, 0
1, 1, 0
0, 1, 1
0, 1, 1
0, 1, 0
0, 1, 0
0, 1, 1
0, 1, 1
0, 1, 0
0, 1, 0
So it looks like it’s clamping everything, even though I thought I’d turned clamping off correctly. I also tried reading the texture directly with glBindTexture and glGetTexImage, but got the same result. The texture is in GL_RGB16F_ARB format. I was really hoping someone might have an idea why this works in debug but not in release. I’m running a 7800 GTX with a driver from 8/11/06. Thanks in advance!