Hey
I have a texture of size 285x32 (maxBasis x basis.size()), with unsigned byte (GLubyte) data.
If I load the texture using:
glTexImage2D (GL_TEXTURE_2D, 0, GL_R8UI,
              maxBasis, basis.size (),
              0, GL_RED_INTEGER_EXT, GL_UNSIGNED_BYTE, baseRefF);
The values on the GPU are a mess: some look correct, others have changed, and some are just wrong.
It almost looks like it is reading from invalid memory locations.
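For reference, the surrounding setup looks roughly like this (a simplified sketch, not my exact code; baseRefF is a tightly packed, row-major GLubyte buffer with one byte per texel):

// Simplified sketch of the setup around the failing call.
GLubyte *baseRefF = new GLubyte[maxBasis * basis.size ()];
// ... fill baseRefF row-major: basis.size () rows of maxBasis bytes each ...

GLuint tex;
glGenTextures (1, &tex);
glBindTexture (GL_TEXTURE_2D, tex);
// Integer textures can only be sampled with nearest filtering.
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D (GL_TEXTURE_2D, 0, GL_R8UI,
              maxBasis, basis.size (),
              0, GL_RED_INTEGER_EXT, GL_UNSIGNED_BYTE, baseRefF);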
However, after several attempts to understand what is happening (I changed the data type, the size, the texture, etc., always with the same result…), I noticed that if I upload using:
glTexImage1D (GL_TEXTURE_1D, 0, GL_R8UI,
              maxBasis * basis.size (),
              0, GL_RED_INTEGER_EXT, GL_UNSIGNED_BYTE, baseRefF);
the values are all fine.
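By "the values are wrong/ok" I mean a straight readback-and-compare against the source buffer, roughly like this (a minimal sketch, not the exact check I run; needs <vector>, <cstdio> and the GL headers):

// Minimal sketch of the comparison; rows are packed tightly for the readback.
std::vector<GLubyte> readback (maxBasis * basis.size ());
glPixelStorei (GL_PACK_ALIGNMENT, 1);
glBindTexture (GL_TEXTURE_2D, tex);
glGetTexImage (GL_TEXTURE_2D, 0, GL_RED_INTEGER_EXT, GL_UNSIGNED_BYTE, readback.data ());
for (size_t i = 0; i < readback.size (); ++i)
    if (readback[i] != baseRefF[i])
        printf ("mismatch at %zu: cpu = %u, gpu = %u\n", i,
                (unsigned) baseRefF[i], (unsigned) readback[i]);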
With the other textures I use, I do not run into any similar problem.
Does anyone know what is going on?
I have an ATI Radeon HD5650 that supports OpenGL 4.1.