Trouble with creating / drawing textures.

I’ve been looking at a number of online resources on how to draw textures in 2D with OpenGL / GLUT. They all seem to say the same thing, and I can’t see what I’m doing wrong.

Here’s a picture showing what’s going wrong:

Here’s the relevant code:


glutInit(&argc, argv);
glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
glutInitWindowSize(600, 600);
glutInitWindowPosition(-1, -1);   /* -1, -1 lets the window manager pick */
glutCreateWindow("glut");

/* pixel-aligned 2D projection with the origin at the top left */
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 600, 600, 0, 0, 1);
glMatrixMode(GL_MODELVIEW);

glDisable(GL_DEPTH_TEST);
glutTimerFunc(0, update_func, 0);
glutTimerFunc(0, draw_func, 0);
glEnable(GL_TEXTURE_2D);

...

glGenTextures(1, texture);
glBindTexture(GL_TEXTURE_2D, texture[0]);
glTexImage2D(GL_TEXTURE_2D, 0, 3, 64, 64, 0, GL_RGBA, GL_UNSIGNED_BYTE, row_pointers);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

...

glBindTexture(GL_TEXTURE_2D, texture[0]);

/* draw the texture on a 64 x 64 quad at the window's top-left corner */
glBegin(GL_QUADS);
glTexCoord2f(0.0, 0.0); glVertex2f(0.0, 0.0);
glTexCoord2f(1.0, 0.0); glVertex2f(64.0, 0.0);
glTexCoord2f(1.0, 1.0); glVertex2f(64.0, 64.0);
glTexCoord2f(0.0, 1.0); glVertex2f(0.0, 64.0);
glEnd();


The image I loaded is a 64 x 64 PNG file with an alpha channel. As far as I can tell I’m loading it correctly with libpng: I inspected the row data and found it to be accurate. Any ideas?

Hi,
Try changing this line

glTexImage2D(GL_TEXTURE_2D, 0, 3, 64, 64, 0, GL_RGBA, GL_UNSIGNED_BYTE, row_pointers);

to this

glTexImage2D(GL_TEXTURE_2D, 0,GL_RGBA, 64, 64, 0, GL_RGBA, GL_UNSIGNED_BYTE, row_pointers);

See if this sorts out your problem.
Regards,
Mobeen

I tried it, I’m still having the same problem. I bet I’m making some bone-headed mistake somewhere (still checking over everything).

Edit: By pretending the image is 66 x 66 when it is definitely 64 x 64, the problem is almost fixed: the image is now displayed properly, but those extra garbage values are still being drawn. That points to row_pointers as the problem, but when I print out all the data it matches up. It is, as expected, a 64 x 256 array of bytes (64 rows of 64 RGBA pixels, 4 bytes each) with the correct values. And by pretending I mean:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 66, 66, 0, GL_RGBA, GL_UNSIGNED_BYTE, row_pointers);

Could you try this: add the following line before the glTexImage2D call,

glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

and keep the dimensions 64, 64. (GL_UNPACK_ALIGNMENT controls the row alignment OpenGL assumes when reading your pixel data.)

It did not change anything. In fact, passing 1, 2 or 4 to that function makes no visible difference.

I don’t know if this will sort out your problem, but are you sure that the data pointer (row_pointers) contains exactly 64x64x4 bytes? Another thing you can try is to make sure the data buffer is filled with zeros, doing something like this

memset(row_pointers, 0, 64*64*4);

so that the garbage values are removed. Other than that, I think your code looks OK; the problem may be elsewhere.

Ah! Thanks so much, that led me to my problem. When I was checking row_pointers I was indexing it as row_pointers[i][j], which dereferences the individual row pointers (and thus gave me the expected output) even though the rows themselves are not contiguous in memory. I just needed to copy all those values into a char buffer and it worked great.
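
For anyone who finds this later, the fix is essentially to flatten libpng’s per-row pointers into one contiguous buffer before the upload. A minimal sketch, assuming row_pointers is the usual png_bytep array for a 64 x 64 RGBA image (needs <string.h> for memcpy):

/* glTexImage2D reads width * height * 4 bytes starting at a single
   pointer, so the rows must sit next to each other in memory. */
unsigned char buffer[64 * 64 * 4];
for (int y = 0; y < 64; y++)
    memcpy(buffer + y * 64 * 4, row_pointers[y], 64 * 4);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, buffer);

That also explains why printing row_pointers[i][j] looked fine: the indexing dereferences each row pointer, so the values are correct even though the rows live in separate allocations.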

I assume you are using libpng to read your texture file. Could you show me the specific libpng code you are using to load the image?

I’m trying to use a cross-platform library, since I develop on Arch Linux, and I want to use PNG, but I can’t for the life of me understand it: I always get a segmentation fault when I call glTexImage2D.
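
Here is roughly what I have, condensed from the usual libpng reading pattern (error handling omitted, and the variable names are my own):

#include <png.h>
#include <stdio.h>
#include <stdlib.h>

FILE *fp = fopen("texture.png", "rb");
png_structp png = png_create_read_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL);
png_infop info = png_create_info_struct(png);
png_init_io(png, fp);
png_read_info(png, info);

int width = png_get_image_width(png, info);
int height = png_get_image_height(png, info);

/* one allocation per row: an array of row pointers, NOT one contiguous block */
png_bytep *row_pointers = malloc(sizeof(png_bytep) * height);
for (int y = 0; y < height; y++)
    row_pointers[y] = malloc(png_get_rowbytes(png, info));
png_read_image(png, row_pointers);
fclose(fp);

/* this is the call that crashes */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, row_pointers);

If I’m reading the posts above correctly, my crash may well be the same issue: I’m handing glTexImage2D the array of per-row pointers where it expects one contiguous pixel buffer.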