Hi guys,
I have two arrays that contain the pixel data glTexImage2D needs.
As far as I know, the two are equal in memory. I pass the pixel data to glTexImage2D,
but they produce different results.
I don't know why.
GLubyte pixels[4 * 4] =
{
    255,   0,   0, 255, // Red
      0, 255,   0, 255, // Green
      0,   0, 255, 255, // Blue
    255, 255,   0, 255  // Yellow
};
You are wrong: those two arrays are not equivalent on a system using little-endian byte order (Endianness - Wikipedia). Also, this is not really an OpenGL question.
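To make this concrete, here is a small sketch. The GLuint array was not posted, so the packed values below are only my guess at what it looked like (0xRRGGBBAA literals matching the byte array), and the typedefs are stand-ins so the snippet compiles without GL headers:

```c
#include <stdio.h>
#include <string.h>

typedef unsigned char GLubyte;  /* stand-in, so no GL headers are needed */
typedef unsigned int  GLuint;   /* stand-in, assumed 32-bit here */

int main(void)
{
    GLubyte bytes[4 * 4] = {
        255,   0,   0, 255,  /* Red    */
          0, 255,   0, 255,  /* Green  */
          0,   0, 255, 255,  /* Blue   */
        255, 255,   0, 255,  /* Yellow */
    };

    /* Assumed second array: one packed 0xRRGGBBAA value per pixel. */
    GLuint packed[4] = { 0xFF0000FF, 0x00FF00FF, 0x0000FFFF, 0xFFFF00FF };

    /* glTexImage2D with GL_UNSIGNED_BYTE walks memory byte by byte,
     * so only the raw byte sequence matters, not the GLuint values. */
    printf("memcmp: %d\n", memcmp(bytes, packed, sizeof bytes));

    /* On a little-endian machine the low-order byte of each GLuint comes
     * first, so 0x00FF00FF is laid out as FF 00 FF 00, not 00 FF 00 FF. */
    const GLubyte *p = (const GLubyte *)&packed[1];
    printf("green pixel as bytes: %d %d %d %d\n", p[0], p[1], p[2], p[3]);
    return 0;
}
```

On a little-endian machine the memcmp result is nonzero and the "green" pixel comes out as 255 0 255 0, which is why the two uploads look different.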
The second array is GLuint and you pass GL_UNSIGNED_BYTE as the type parameter to glTexImage2D, so it interprets the second array incorrectly. You had better convert the GLuint array to GLubyte before feeding it to glTexImage2D. OpenGL is not very friendly with integer data (it is a bit trickier to work with and requires extensions), and I doubt you really need it.
Ignore the comment from Nowhere-01, he/she is confusing things. The problem is the endianness. There is no "incorrect interpretation". Everything is well defined, and what he is referring to (integer texture formats) is a completely different thing.
OK, I didn't pay attention to the contents of the arrays, but I don't get why you would do it like that. The confusing comment was removed. It really is an endianness issue.