View Full Version : Reading Data From a Fragment Shader

07-13-2010, 11:15 AM
OK, first off, my apologies for posting what seems like a really basic problem, but I've done a lot of searching and I just can't seem to find the answer I'm looking for.

My problem is that I want to transfer integer data from a fragment shader to the CPU, but I can't seem to read the data correctly.

Rendering to the texture from the fragment shader using an FBO works fine. I can display the FBO's texture to the screen and everything looks good.

But when I read the values in the buffer, they are not what I would expect.

I set up the texture like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, NULL);

For testing, my fragment shader looks like this:

void main()
{
    gl_FragColor.b = 100.0/255.0;
    gl_FragColor.g = 10.0/255.0;
    gl_FragColor.r = 1.0/255.0;
    gl_FragColor.a = 100.0/255.0;
}

And the code that reads the data:

glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo); // PBO previously created with glGenBuffers
glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4 * sizeof(float), NULL, GL_STREAM_READ);
glReadPixels(0, 0, width, height, GL_BGRA, GL_FLOAT, 0); // 0 = offset into the bound PBO
float* buffer = (float*)glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);

My results are:
buffer[0] = 0.15294118
buffer[1] = 0.015686275
buffer[2] = 0.00000000
buffer[3] = 0.76078439
repeats for every 4 floats.

Obviously, I'm doing something wrong... I'm just not sure what.

All I really want to do is write an integer (less than 255, so 8 bits is enough precision) to each channel (RGBA) in the fragment shader and read it back on the CPU. If I could write and read explicit bits this would seem a lot easier, but I'm not sure how to do that in GLSL.

Thanks in advance for any help or a link to how I can figure this out.

07-13-2010, 12:24 PM
I figured out the problem: with blending enabled, the fragment output was being blended against the cleared FBO whenever the alpha value was not 1.0. Disabling GL_BLEND fixed my issue.

Thanks to anyone who took the time to think about this.