glGetTexImage problems

Am I missing something really obvious here ?
(Win2K; GeForceFX 5700LE)

I’m filling a 16*16*4 array with values, creating a texture from it, reading that texture back into another array, and then comparing the two arrays (original and reread). Why are they not the same ?

Thanks - RS.

  
int tex_size_x = 16, tex_size_y = 16;
unsigned int num_bytes = tex_size_x * tex_size_y * 4;
size_t mem_size = num_bytes * sizeof(GLubyte);

GLubyte* OriginalArray = (GLubyte*) malloc(mem_size);
GLubyte* ReadBackArray = (GLubyte*) malloc(mem_size);

// Fill the source image with a simple increasing byte pattern
for (unsigned int bi = 0; bi < num_bytes; bi++) { OriginalArray[bi] = bi % 255; }

// Initialize Texture
GLuint texture_id;
glEnable(GL_TEXTURE_2D);

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glPixelStorei(GL_PACK_ALIGNMENT, 1);

glGenTextures(1, &texture_id);
glBindTexture(GL_TEXTURE_2D, texture_id);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

glTexImage2D(GL_TEXTURE_2D, 0, 4, tex_size_x, tex_size_y, 0, GL_RGBA, GL_UNSIGNED_BYTE, OriginalArray);

glDisable(GL_TEXTURE_2D);

// ... some stuff

// Read texture into ReadBackArray
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, texture_id);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, ReadBackArray);
glDisable(GL_TEXTURE_2D);

Should OriginalArray and ReadBackArray be the same here ? (they’re not - see below)

[texel_pos] : (OriginalRGBA) -> (RereadRGBA)
[0][0] : (0, 1, 2, 3)->(0, 0, 0, 0)
[0][1] : (4, 5, 6, 7)->(0, 0, 0, 0)
[0][2] : (8, 9, 10, 11)->(0, 0, 0, 0)
[0][3] : (12, 13, 14, 15)->(0, 0, 0, 0)
[0][4] : (16, 17, 18, 19)->(17, 17, 17, 17)
[0][5] : (20, 21, 22, 23)->(17, 17, 17, 17)
[0][6] : (24, 25, 26, 27)->(17, 17, 17, 17)
[0][7] : (28, 29, 30, 31)->(17, 17, 17, 17)
[0][8] : (32, 33, 34, 35)->(34, 34, 34, 34)
[0][9] : (36, 37, 38, 39)->(34, 34, 34, 34)
[0][10] : (40, 41, 42, 43)->(34, 34, 34, 34)
[0][11] : (44, 45, 46, 47)->(34, 34, 34, 34)
[0][12] : (48, 49, 50, 51)->(51, 51, 51, 51)
[0][13] : (52, 53, 54, 55)->(51, 51, 51, 51)
[0][14] : (56, 57, 58, 59)->(51, 51, 51, 51)
[0][15] : (60, 61, 62, 63)->(51, 51, 51, 51)
[1][0] : (64, 65, 66, 67)->(68, 68, 68, 68)
[1][1] : (68, 69, 70, 71)->(68, 68, 68, 68)
etc etc
  

Check the sizes of the R, G, B and A components after you have uploaded the image: glGetTexLevelParameteriv with GL_TEXTURE_{RED|GREEN|BLUE|ALPHA}_SIZE. That will tell you which internal format the texture actually ended up with.
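
Something like this (a minimal sketch; it assumes the texture is still bound and that <stdio.h> is available for the printout):

GLint red_bits = 0, green_bits = 0, blue_bits = 0, alpha_bits = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &red_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &green_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blue_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alpha_bits);
printf("component sizes: R%d G%d B%d A%d\n", red_bits, green_bits, blue_bits, alpha_bits);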

Since you’re not specifying an explicit internal format, just 4 (which is equivalent to GL_RGBA), OpenGL is free to pick whatever internal format it likes. Judging by the values you posted, the driver appears to have picked GL_RGBA4, i.e. 4 bits per component: every group of consecutive source bytes collapses to the same 4-bit value, which reads back re-expanded to v * 17 (0, 17, 34, 51, 68, …).
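
To illustrate what that does to your data (a rough sketch of the conversion; the exact rounding is up to the driver, but simple truncation matches the numbers you posted):

GLubyte original = 37;                   // any source byte
GLubyte stored   = original >> 4;        // kept as only 4 bits: 2
GLubyte readback = stored * 255 / 15;    // re-expanded to 8 bits: 2 * 17 = 34

Every source byte from 32 to 47 comes back as 34, which is exactly the pattern in your dump.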

If you really need the exact image back, set the internal format to GL_RGBA8. But keep in mind that it’s still just a hint; OpenGL can still choose another format if it likes.
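
In your code only the third argument of the glTexImage2D call needs to change, for example:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tex_size_x, tex_size_y, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, OriginalArray);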

Thanks - yes, specifying RGBA8 works fine.

One more problem :

I’ve switched all GL_UNSIGNED_BYTE to GL_BYTE, and all memory/malloc pointers from GLubyte to GLbyte.
Now I want to write values in either the -127…127 range with GL_RGBA8, or the -3…3 range with GL_RGBA4.
The negative values are all being clamped to zero on read back.

Any thoughts ?

Thanks - RS.

Texture data is always in range [0, (2^b)-1], where b is the number of bits in the datatype storing a channel of the image. Negative values are therefore clamped to zero.

edit: b above is the number of value bits, excluding any sign bit, not the total number of bits in the datatype. For 8-bit signed integers, b = 7, for example.
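
If you need signed data to survive the round trip, one common workaround (just a sketch; SignedArray, Upload, ReadBack and SignedReadBack are placeholder names) is to bias the values into the unsigned range before upload and undo the bias after readback:

// Bias signed [-128, 127] into unsigned [0, 255] before uploading
GLubyte* Upload = (GLubyte*) malloc(mem_size);
for (unsigned int bi = 0; bi < num_bytes; bi++)
    Upload[bi] = (GLubyte)((int)SignedArray[bi] + 128);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tex_size_x, tex_size_y, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, Upload);

// ... later, read back as unsigned bytes and undo the bias
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, ReadBack);
for (unsigned int bi = 0; bi < num_bytes; bi++)
    SignedReadBack[bi] = (GLbyte)((int)ReadBack[bi] - 128);

With GL_RGBA8 that gives the original signed values back exactly.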

Got it - thanks again - RS.