MD2 models with SDL textures

I’m trying to load an MD2 model, using SDL for textures. I’ve seen this done before, but all the tutorials and links are dead, so I’ve written my own code.

Sadly, it doesn’t work: the model loads fine, but it doesn’t texture! I thought I had set everything up right, but I can’t figure out what’s wrong.

I’ve narrowed it down to the gluBuild2DMipmaps line, which crashes the program if I leave it uncommented.

I’ve been trying to figure this out for a while now and I can’t put it off much longer (it’s hard to make a game engine without 3D models)! I’m posting because I know a pro OpenGL/SDL guru could take one look at this and know what’s going wrong.

Thanks for taking a look.

class CMD2Model
{
public:

etc.  etc.

private:
  

  modelHeader_t m_info;
  GLuint finaltexture[1];
  
  GLfloat texcoord[4];
  SDL_Surface *md2image;

  mesh_t *m_tris;           // triangle list
  texCoord_t *m_texCoords;  // texture coordinate list
  xyz_t *m_verts;           // vertex list
  xyz_t *m_currentVerts;    // working vertex list

  GLfloat m_miny;
  xyz_t m_pos;
  GLfloat m_rotate;
};

bool CMD2Model::Load(char *modelFile, char *skinFile, GLfloat scale)
{
  // open the model file
  etc. etc.
  
  // texture the model
  md2image = load_image(skinFile);
  finaltexture[0] = SDL_GL_LoadTexture( md2image, texcoord );
  glBindTexture(GL_TEXTURE_2D, finaltexture[0]);
  gluBuild2DMipmaps(GL_TEXTURE_2D, 3, md2image->w, md2image->h, GL_RGB, GL_UNSIGNED_BYTE, md2image->pixels);
  SDL_FreeSurface( md2image );

  m_currentFrame = 0;
  m_nextFrame = 1;
  m_interpol = 0.0;

  // animate at least one frame to make sure that m_currentVerts gets initialized
  SetAnimation(IDLE);
  Animate(0);
  
  Log("%s loaded succesfully!
", modelFile );
  return TRUE;
}
void CMD2Model::Render()
{
  glPushMatrix();
  glTranslatef(m_pos.x, m_pos.y, m_pos.z);
  glRotatef(m_rotate, 0, 1, 0);

  glBindTexture(GL_TEXTURE_2D, finaltexture[0]);
  glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);
  glEnableClientState(GL_VERTEX_ARRAY);
  glEnableClientState(GL_TEXTURE_COORD_ARRAY);

  glVertexPointer(3, GL_FLOAT, 0, m_currentVerts);
  glTexCoordPointer(2, GL_FLOAT, 0, m_texCoords);

  glDrawArrays(GL_TRIANGLES, 0, m_info.numTris * 3);
  
  glDisableClientState(GL_TEXTURE_COORD_ARRAY);
  glDisableClientState(GL_VERTEX_ARRAY);
  glPopMatrix();
}

Perhaps useful:

Sadly, it doesn’t work. … I’ve narrowed it down to the gluBuild2DMipmaps line, which crashes the program if I leave it uncommented.

Sounds like you’ve got memory problems. Time to pull out valgrind, Purify, or your favorite memory debugger.

gluBuild2DMipmaps(GL_TEXTURE_2D, 3, md2image->w, md2image->h, GL_RGB, GL_UNSIGNED_BYTE, md2image->pixels);

I doubt this is it, but use an explicit internal format such as GL_RGB8 rather than just “3”. Though yes, 3 should work. Also, are your textures power-of-two? gluBuild2DMipmaps will resize them if not.
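For reference, roughly what I mean (just a sketch; it also assumes md2image->pixels really is tightly packed 24-bit RGB, which SDL doesn’t guarantee in general):

// Explicit sized internal format instead of the legacy "3".
// Unpack alignment of 1 covers rows that aren't 4-byte aligned.
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, md2image->w, md2image->h,
                  GL_RGB, GL_UNSIGNED_BYTE, md2image->pixels);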

BTW, once you figure out the bug, consider using glGenerateMipmap or the GL_GENERATE_MIPMAP texture parameter instead. However, see this thread and this thread first if you need to run on ATI hardware.
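Something along these lines (a sketch only; w, h, and pixels are placeholders for your image data, and glGenerateMipmap needs GL 3.0+ or the EXT_framebuffer_object extension):

// Option 1 (GL 1.4+): have the driver rebuild MIPmaps automatically
// whenever the base level changes.
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);

// Option 2 (GL 3.0+): upload level 0, then build the whole chain in one call.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);
glGenerateMipmap(GL_TEXTURE_2D);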

Thank you for responding to my post. The image for my model is 512×512, so I don’t think that’s the problem. I will look into debugging my app, though.

Out of curiosity, what might I look for in trying to find the gluBuild2DMipmaps memory leak?

Also, if I comment out the line and compile my app, the model still doesn’t texture. I’m new to OpenGL (I’ve been working with it for about a month) and I’m not sure why mipmapping is necessary to get a model textured. I still call glBindTexture in my rendering function, so I thought the model would texture anyway? Might it be something I did or disabled in an earlier part of my program that’s keeping the textures off it?

Sorry for the questions, but I’m trying to absorb all the knowledge I can! :o

Thank you
scarypajamas

Who said it was a leak? :wink: That wouldn’t crash your app. I said memory problem. Things like ABWs (array bounds writes, i.e. writing past the end of an array) are the kind of thing I’m talking about.
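A contrived example of the shape of bug I mean (not from your code):

GLuint tex[1];   // one valid element: tex[0]
tex[1] = 0;      // ABW: writes one past the end of the array; this corrupts
                 // whatever happens to live next in memory and typically
                 // crashes somewhere unrelated, much later.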

I’m new to OpenGL (I’ve been working with it for about a month) and I’m not sure why mipmapping is necessary to get a model textured.

It’s not required. You can turn off MIPmapping, not provide MIPmaps, and your model will render. However, it will shimmer like crazy when the texture is minified (i.e. displayed at more than one texel per pixel). This isn’t an OpenGL thing; it’s a real-time graphics thing.

The fundamental problem is that the GPU doesn’t have time to add up hundreds or thousands of texels in a texture and take the average for every pixel on the display. So we provide MIPmaps to the GPU so it can do those integration computations very quickly. Essentially, we pre-average (pre-integrate) areas of the texture for it.

Graphics is all about integration, and this is just one place where it comes up.
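Concretely, for your 512×512 skin, the MIPmap chain is just the image pre-averaged down level by level. A throwaway illustration (not part of your loader):

#include <cstdio>

int main()
{
    // Each level halves both dimensions until we reach 1x1:
    // 512x512, 256x256, 128x128, ..., 1x1 -- ten levels in total.
    for (int level = 0, size = 512; size >= 1; ++level, size /= 2)
        std::printf("level %d: %dx%d\n", level, size, size);
    return 0;
}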

I still call glBindTexture in my rendering function, so I thought the model would texture anyway? Might it be something I did or disabled in an earlier part of my program that’s keeping the textures off it?

You’ll have to change your texture min filter so it doesn’t use MIPmaps. E.g.:

glTexParameteri( gl_target, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

You could use GL_NEAREST as well. The main thing is: if you don’t provide MIPmaps, don’t use a minification filter that requires them (e.g. GL_LINEAR_MIPMAP_LINEAR).
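Put together, a minimal no-MIPmap setup looks something like this (a sketch; w, h, and pixels stand in for your surface’s data):

glBindTexture(GL_TEXTURE_2D, finaltexture[0]);

// No MIPmaps: only level 0 is uploaded, so the min filter must be one that
// doesn't sample MIPmap levels. Note the default min filter is
// GL_NEAREST_MIPMAP_LINEAR, which *does* require them -- a classic cause
// of untextured (white) geometry.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, pixels);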

Sorry for the questions, but I’m trying to absorb all the knowledge I can! :o

Hey, no problem! That’s where we all start! Better to ask questions than to get frustrated and go write poetry.

Hey! Thanks for responding again, Dark Photon! :smiley: I really appreciate your answers, and I finally figured out why my model wouldn’t texture. Apparently I was trying to optimize and mask all the textures I was loading with my load_image() function. I guess gluBuild2DMipmaps crashed when it came time to work with the optimized images.

I also used your suggested glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR) to texture the model.

This SDL function is the one that works for loading images:

SDL_Surface *load_image( std::string filename ) 
{
    //We need to check if we've already loaded this image and then
    //give the caller the same image again.
/*  for( int t=0; t<1024; t++ )
    {
        if( solid_texture[t] == filename ) //find a texture that matches
        {
            return solid_texture[t];
        }
    }

    //Store the name into memory
    solid_texture[st] = filename;
    st += 1;*/

    //The image that's loaded
    SDL_Surface* loadedImage = NULL;

    //Load the image
    loadedImage = IMG_Load( filename.c_str() );

    //Return the raw loaded surface, without "optimizing" it
    return loadedImage;
}

This function is what was screwing up gluBuild2DMipmaps. It didn’t like that I was trying to optimize the image:


SDL_Surface *load_image( std::string filename ) 
{
    //We need to check if we've already loaded this image and then
    //give the caller the same image again.
/*  for( int t=0; t<1024; t++ )
    {
        if( solid_texture[t] == filename ) //find a texture that matches
        {
            return solid_texture[t];
        }
    }

    //Store the name into memory
    solid_texture[st] = filename;
    st += 1;*/

    //The image that's loaded
    SDL_Surface* loadedImage = NULL;

    //The optimized surface that will be used
    SDL_Surface* optimizedImage = NULL;

    //Load the image
    loadedImage = IMG_Load( filename.c_str() );

    //If the image loaded
    if( loadedImage != NULL )
    {
        //Create an optimized surface
        optimizedImage = SDL_DisplayFormat( loadedImage );

        //Free the old surface
        SDL_FreeSurface( loadedImage );

        //If the surface was optimized
        if( optimizedImage != NULL )
        {
            //Color key the surface (this is what breaks gluBuild2DMipmaps)
            SDL_SetColorKey( optimizedImage, SDL_RLEACCEL | SDL_SRCCOLORKEY, SDL_MapRGB( optimizedImage->format, 0, 0xFF, 0xFF ) );
        }
    }

    //Return the optimized surface
    return optimizedImage;
}

Thanks for your help, Dark Photon
scarypajamas

Yeah, that makes sense. Looks like this says to chroma key the image (select one color as transparent, and RLE-compress the transparent pixels). Consequently, the pixel data you end up with is a data block in a format that OpenGL and its libraries do not understand.

So basically you were telling gluBuild2DMipmaps the data was in a format that it wasn’t in; as a result, it probably plowed past the end of the pixel array and crashed the application.
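For anyone hitting the same thing: one way to upload safely no matter what SDL hands you (a sketch, untested against this loader; src stands for the loaded surface, and the masks assume a little-endian machine with the SDL 1.2 API) is to blit into a surface whose layout you control first:

// Convert an arbitrary SDL surface to a known 24-bit RGB layout.
SDL_Surface *rgb = SDL_CreateRGBSurface(SDL_SWSURFACE, src->w, src->h, 24,
                                        0x000000FF, 0x0000FF00, 0x00FF0000, 0);
SDL_BlitSurface(src, NULL, rgb, NULL);  // format conversion happens here

// Assumes rgb->pitch == rgb->w * 3 (true for 512-wide images).
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, rgb->w, rgb->h,
                  GL_RGB, GL_UNSIGNED_BYTE, rgb->pixels);
SDL_FreeSurface(rgb);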

Hi,

I’m also using this code (I guess this is the code from the “Beginning OpenGL Game Programming” book :wink: ) …

My problem is nearly the same: the MD2 model is drawn, but there’s no texture on it.
The only difference is that I’m trying to run the MD2 model loader + the TGA image loader from the book with GLFW, and I have no idea why the texture isn’t drawn:

Here’s the code I put into md2.cpp in the CMD2Model::Load(…) method (near the end of the method):

...
GLFWimage image2;
if(glfwReadImage(skinFile, &image2, GLFW_ORIGIN_UL_BIT) != GL_TRUE){
    return 0;
}
glGenTextures(1, &m_texID);
glBindTexture(GL_TEXTURE_2D, m_texID);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, image2.Format, image2.Width, image2.Height,
             0, image2.Format, GL_UNSIGNED_BYTE, reinterpret_cast<void*>(image2.Data));
glfwFreeImage(&image2);

....

I would be very happy if someone could help me on this issue!

Problem solved: it had nothing to do with GLFW; I just initialized the models in the wrong order.