Hi all!
I'm trying out some of the OpenGL 3.x features, which are really neat.
The problem I'm currently having is that I get weird values when I read back a texture of type GL_RED (internal format GL_R8UI). In the shader I have declared the output as
out uint var;
and set it with
var = uint(128);
for example, but the values I read back are somewhat random.
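For reference, here is a minimal sketch of how I picture the two shader outputs together; the second output's name, the version directive, and the overall structure are my assumptions, not the exact shader:

```glsl
#version 150
// Sketch only: one unsigned-integer output (intended for the GL_R8UI
// attachment) and one vec3 output (for the normalized attachment).
out uint var;    // integer output, e.g. bound to color attachment 1
out vec3 color;  // hypothetical name for the vec3 output

void main()
{
    var   = uint(128);
    color = vec3(1.0, 0.0, 0.0);
}
```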
The shader has another output, a vec3, that works perfectly; glGetFragDataLocation correctly reports indices 0 and 1 for the two outputs. The framebuffer is valid, since it is complete and the vec3 attachment displays correctly. I have also disabled glClampColor for GL_CLAMP_READ_COLOR and set GL_PACK_ALIGNMENT to 1.
So my question is: should I fall back to using a float instead? I would really like to use a single integer here, because it is more suitable for the calculations I'm doing (it's not game related).
Thanks in advance
//Johan
EDIT: Floats didn't work much better, though. Using glReadPixels with the framebuffer bound did work better than reading from the texture; the values were consistent. They did not match the values written in the shader, however, so some sort of clamping/scaling is being done.
I did find this thread discussing bugs in the driver:
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=194053&page=2