texture mapping

I’m trying to learn to use textures with the Red Book and the SuperBible; I’ve also been through the GL game programming wiki and some other tutorials. None of these use exactly the same procedure, but it still seems like it should be DEAD SIMPLE.

Can someone tell me why what I’m doing does not work? This is my initialization/setup function, which loads the texture. The “gltLoadTGA” function is from the SuperBible toolkit and works in the demos there:


void init(float R, float G, float B) {
        GLubyte *tximg;
        GLint wd, hgt, comp;
        GLenum form;
        GLuint Texture;

        glClearColor(R,G,B,1.0f);
    
        glEnable(GL_TEXTURE_2D);

        glGenTextures(1,&Texture);
    
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* because of tga format */
        tximg = gltLoadTGA("hexpattern.tga",&wd,&hgt,&comp,&form); 
        glTexImage2D(GL_TEXTURE,0,comp,wd,hgt,0,form,GL_UNSIGNED_BYTE,tximg);
        free(tximg);
        
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
        glBindTexture(GL_TEXTURE_2D,Texture);   
}

I’ve tried a few different options with TexParameter and TexEnv. Then the scene rendering function:


void scene() {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glColor4f(1.0f,1.0f,1.0f,1.0f);
        glBegin(GL_QUADS);
                glTexCoord2f(0.0f,0.0f); glVertex3f(-10.0f,-10.0f,0.0f);  
                glTexCoord2f(0.0f,1.0f); glVertex3f(-10.0f,10.0f,0.0f);  
                glTexCoord2f(1.0f,1.0f); glVertex3f(10.0f,10.0f,0.0f);  
                glTexCoord2f(1.0f,0.0f); glVertex3f(10.0f,-10.0f,0.0f);
        glEnd();
        gluLookAt(0.0f,0.0f,50.0f,0.0f,0.0f,0.0f,0.0f,1.0f,0.0f);
        glutSwapBuffers();
}

I have no lighting enabled here because this is a simplification of the original scene, and nothing I’ve read suggests it’s necessary. But no matter what I do, all I end up with is a white square.

“If texturing is enabled (and TEXTURE_MIN_FILTER is one that requires a mipmap) at the time a primitive is rasterized and if the set of arrays 0 through n is incomplete, based on the dimensions of array 0, then it is as if texture mapping were disabled.”

By default, the texture min filter (GL_NEAREST_MIPMAP_LINEAR) does require a mipmap.
http://www.opengl.org/resources/features/KilgardTechniques/oglpitfall/

A quick fix to actually see your texture is to change the MIN filter to one that does not require a mipmap:
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

Ugly, but should work.

Better: let the hardware autogenerate mipmaps whenever it feels the need:
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
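Note that GL_GENERATE_MIPMAP has to be set on the bound texture before glTexImage2D is called, since the mipmaps are generated at upload time. Something like this (assuming the texture is already bound, using the variables from your init):

glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);   /* set before the upload */
glTexImage2D(GL_TEXTURE_2D, 0, comp, wd, hgt, 0, form, GL_UNSIGNED_BYTE, tximg);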

Even better: trigger the mipmap generation on the card manually, with the newer:
glGenerateMipmapEXT(GL_TEXTURE_2D);
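That one works the other way around: call it after the texture data has been uploaded, and it builds the mipmap chain for the currently bound texture (it comes from EXT_framebuffer_object; on GL 3.0+ it is plain glGenerateMipmap). Roughly:

glTexImage2D(GL_TEXTURE_2D, 0, comp, wd, hgt, 0, form, GL_UNSIGNED_BYTE, tximg);
glGenerateMipmapEXT(GL_TEXTURE_2D);   /* after the upload, on the bound texture */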

The first suggestion is already in the OP. The second one did nothing, and the third one actually crashed my X server (I don’t have a real graphics card, so I’m using software emulation, which is fine; I’m not a gamer).

Oops, sorry, these tiny boxes are hard to read.
Well, the rest seems OK. I am not familiar with gltLoadTGA; what are the values of wd, hgt, comp, and form after it is called?

Try a basic array, to rule out a file or loader problem:
// the texture (2x2)
GLbyte textureData[] = { 128, 128, 128, 255, 0, 0, 0, 255, 0, 0, 0, 255, 255, 255, 0 };
GLsizei width = 2;
GLsizei height = 2;
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, (GLvoid*)textureData);

Good idea. Same thing. This is the new initialization.


void init(float R, float G, float B) {
        GLuint Texture;
        GLbyte textureData[] = { 128, 128, 128, 255, 0, 0, 0, 255, 0, 0, 0, 255, 255, 255, 0 };
        GLsizei width = 2;
        GLsizei heigth = 2;

        glClearColor(R,G,B,1.0f);
    
        glEnable(GL_TEXTURE_2D);

        glGenTextures(1,&Texture);
    
        glTexImage2D(GL_TEXTURE,0,GL_RGB8,width,heigth,0,GL_RGB,GL_UNSIGNED_BYTE,(GLvoid*)textureData);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);

        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
        glBindTexture(GL_TEXTURE_2D,Texture);   
}

There’s an odd number of bytes in textureData – is that right? (Adding an extra zero didn’t make much difference anyway)

Can you take a close look at the TexCoord/Vertex combinations in the OP to make sure I have understood this correctly?

I think it should be GLubyte (note the “u” for unsigned):

GLubyte textureData[]
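And for a 2x2 RGBA test image that is 16 unsigned bytes, four per texel; something like this (a guess at what was intended, with the missing last alpha set to 255):

GLubyte textureData[] = { 128, 128, 128, 255,    0,   0,   0, 255,
                            0,   0,   0, 255,  255, 255,   0, 255 };
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);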

Try moving “glBindTexture” to immediately after “glGenTextures”.
glTexImage2D, glTexParameteri, and the like operate on the currently bound texture, which in your case is not the texture you just generated.
glBindTexture is kind of like glMatrixMode – if it’s not the very first thing you do, you should know why not. :wink:
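In other words, something along these lines (just a sketch of the call ordering with the 2x2 test data; GL_RGBA assumes four bytes per texel):

glGenTextures(1, &Texture);
glBindTexture(GL_TEXTURE_2D, Texture);   /* bind first; everything below now applies to it */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);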