View Full Version : Texture Problem

06-24-2002, 02:46 PM
I am having a problem displaying my texture using a quad. If I draw it using glDrawPixels, it's fine. I would rather use a quad since it is more flexible. The pixel data is stored in an array of unsigned chars.

Below are the code snippets. Can anyone tell me what I am doing wrong? Thanks!

// generate a 2d texture image from an array
pBitmap->w = (int)array[0]; // first 2 entries hold width & height.
pBitmap->h = (int)array[1];
pBitmap->pixels = &array[2];
glGenTextures( 1, &(pBitmap->texId) );
glBindTexture(GL_TEXTURE_2D, pBitmap->texId );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, pBitmap->w, pBitmap->h, 0, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

// draw a texture (2d) x axis across, y axis down

#if 1  // glDrawPixels path -- this one works

glRasterPos2f( (float)x + .5f, (float)y + (float)pBitmap->h + .5f );
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 );
glDrawPixels( pBitmap->w, pBitmap->h, GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels );

#endif


glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, pBitmap->texId );

glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );
glBegin( GL_QUADS);

// Bottom Left
glTexCoord2f( 0, 0 );
glVertex2i( x, y );
// Top left
glTexCoord2f( 0, 1 );
glVertex2i( x, y+h );
// Top right
glTexCoord2f( 1, 1 );
glVertex2i( x+w, y+h );
// Bottom right
glTexCoord2f( 1 , 0 );
glVertex2i( x+w, y );
glEnd();
glDisable( GL_TEXTURE_2D );
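For reference, a minimal sketch of the whole texture path, assuming a valid GL context and the same `x`/`y`/`w`/`h`/`pixels` values as above. The glTexParameteri filter calls are the detail most often missed: the default minifying filter is GL_NEAREST_MIPMAP_LINEAR, which expects mipmap levels, so a texture uploaded with only level 0 is "incomplete" and samples as white.

```c
/* Sketch only: assumes a current GL context and tightly packed RGB data. */
GLuint texId;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);

/* Without these, the default GL_NEAREST_MIPMAP_LINEAR min filter
   requires mipmap levels and leaves the texture incomplete (white). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);

glEnable(GL_TEXTURE_2D);
glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2i(x,     y);
    glTexCoord2f(0, 1); glVertex2i(x,     y + h);
    glTexCoord2f(1, 1); glVertex2i(x + w, y + h);
    glTexCoord2f(1, 0); glVertex2i(x + w, y);
glEnd();
glDisable(GL_TEXTURE_2D);
```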


06-25-2002, 03:14 AM
Can you see the Quad ok?

06-25-2002, 08:11 AM
Yes. I get a white quad. At first I thought it was back face culling, but I turned back face culling on and the quad is still there. Then I thought it was the vertex order, so I tried changing it around. Nothing seems to get it to work... :(
I assume it is something I am doing wrong in the texture setup phase.

06-25-2002, 08:36 AM
Does your texture have dimensions that are a power of 2? i.e. 32x32, 64x32, 128x128, etc.

If not, then that is your problem. :)

06-25-2002, 09:01 AM
The textures are powers of 2 (128x128 & 64x64).
None of the gl commands are returning errors.
The texture IDs seem to be ok.
Are there some glEnable or disable commands I can try turning on/off that might help determine what is going on?

06-26-2002, 03:31 AM
Have you tried using gluBuild2DMipmaps to create your texture?
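Such a call replaces the glTexImage2D line and generates all the mipmap levels; a sketch, assuming the same pBitmap fields as in the original post:

```c
/* Builds level 0 plus all smaller mipmap levels in one call,
   so the default mipmapping min filter has data to sample. */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, pBitmap->w, pBitmap->h,
                  GL_RGB, GL_UNSIGNED_BYTE, pBitmap->pixels);
```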

06-26-2002, 05:06 AM
Are you creating the textures after you've initialized the window to use OpenGL?

Jeffry J Brickley
06-26-2002, 05:15 AM
What are the physical sizes of your textures?

06-26-2002, 05:23 AM
I'm not sure what you are doing wrong, but the above code would be much simpler using the library DevIL. You can find it at www.imagelib.org (http://www.imagelib.org). It uses OpenGL-style syntax and is very intuitive.

Here is an example of loading an image into OpenGL:

ILuint Image;

ilGenImages(1, &Image);
ilBindImage(Image);
ilLoadImage("image.tga"); /* your image file */
GLuint TexId = ilutGLBindTexImage();
ilDeleteImages(1, &Image);
glBindTexture(GL_TEXTURE_2D, TexId);

and bang, you're done. Now everything is bound and set for OpenGL. That easy. You can also build mipmaps this way, with a call to ilutGLBuildMipmaps() instead of ilutGLBindTexImage(), and boom, your mipmaps are done. I would recommend this way. But as I stated before, no idea what your current problem is.

06-26-2002, 07:41 AM
Pops - Nope. How would this help?
Deiussum - Yes. There are no reported OpenGL errors.
Jeffry - 128x128 & 64x64.

I am going to put this on the back burner for now and come back to it in a couple of days, once I've finished fleshing out some other sections of code.

Thanks for the help. :)

06-26-2002, 01:10 PM
You may want to make sure the texture matrix is the identity, and that the clamping modes are set to clamp (or try wrap too).
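A sketch of both checks, assuming the texture is currently bound (GL_CLAMP was the clamping enum in GL 1.x; GL_REPEAT is the wrap alternative mentioned above):

```c
/* Make sure nothing has left a scale/translate in the texture matrix. */
glMatrixMode(GL_TEXTURE);
glLoadIdentity();
glMatrixMode(GL_MODELVIEW);

/* Clamp texture coordinates (or try GL_REPEAT instead). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
```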

06-27-2002, 08:45 AM
Maybe you have waxx0r wrapping/env settings. Add these right after your glTexImage2D call:

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

And then add this somewhere (preferably during initialization), once, just for convenience's sake -- if you want modulation later you can change it, but for now, just to get it going:

glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE );

If there's one error I see more often than any other, it's people calling glTexEnv with a first argument of GL_TEXTURE_2D. NO! The only valid first argument here is GL_TEXTURE_ENV. Make sure you don't have that -- it'll shoot you in the foot from the get-go.