Texture mapping using a DDS image

I am trying to texture map using a DDS image and NVIDIA's nvImage.h loader in my game. I have used the exact same code before (taken from a tutorial) and it worked, but when I include it in my game the texture doesn't load. The code where I think the problem lies is shown below:

nv::Image img;
if (img.loadImageFromFile("lect.dss"))
{
    glGenTextures(1, &myTexture);
    glBindTexture(GL_TEXTURE_2D, myTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
    glTexImage2D(GL_TEXTURE_2D, 0, img.getInternalFormat(), img.getWidth(), img.getHeight(), 0, img.getFormat(), img.getType(), img.getLevel(0));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
}
else
{
    MessageBox(NULL, "Texture Lect is not found", "Error", MB_OK | MB_ICONINFORMATION);
}

Anyone know what I am doing wrong? I have been having problems with texture mapping for a while so it could be something really obvious.

Assuming you're using DXT1 (why else use DDS?), replace all those GL calls with:

glEnable(GL_TEXTURE_2D);
glGenTextures(1, &myTexture);
glBindTexture(GL_TEXTURE_2D, myTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 16.0f);
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT, wid, hei, 0, nSize, data);
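
If it isn't clear where wid, hei, nSize and data should come from, here is a rough sketch of wiring this up to the same nv::Image loader. It assumes your nvImage.h has the getImageSize() and isCompressed() accessors found in the NVIDIA SDK version of the class; check your header and adjust the names if yours differs:

nv::Image img;
if (img.loadImageFromFile("lect.dds") && img.isCompressed())
{
    GLsizei wid = img.getWidth();
    GLsizei hei = img.getHeight();
    GLsizei nSize = img.getImageSize(0);          // size in bytes of mip level 0 (assumed accessor)
    const GLvoid* data = img.getLevel(0);         // pointer to the compressed block data

    glEnable(GL_TEXTURE_2D);
    glGenTextures(1, &myTexture);
    glBindTexture(GL_TEXTURE_2D, myTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT, wid, hei, 0, nSize, data);
}

If the DDS might be DXT3 or DXT5 instead, img.getInternalFormat() should already report the matching compressed format, so you could pass that rather than hard-coding DXT1.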

Thanks, but what type do I need to define data as? I tried:
const GLvoid data;

But loads of errors come up.

IMO, the best choice is an array of unsigned bytes (GLubyte), since nSize in the above code is the number of bytes in the image data.

Still getting an error:
1>.\game1.cpp(192) : error C2664: 'void (GLenum,GLint,GLenum,GLsizei,GLsizei,GLint,GLsizei,const GLvoid *)' : cannot convert parameter 8 from 'GLubyte' to 'const GLvoid *'

Any ideas? It is probably something really simple, but I haven't done enough of this to spot it yet.

I have managed to repair the damage to the original code and it now loads a texture, but when I try different textures they show up as plain white. Is there anything about .dds textures that I need to make sure is right? I even opened the texture that was working in Paint.NET, drew a line over it, and now it shows up as white as well.

1>.\game1.cpp(192) : error C2664: 'void (GLenum,GLint,GLenum,GLsizei,GLsizei,GLint,GLsizei,const GLvoid *)' : cannot convert parameter 8 from 'GLubyte' to 'const GLvoid *'

Clearly, you are not passing an array of unsigned bytes to glCompressedTexImage2D; you declared data as a single GLubyte value. The last parameter has to be the address of the texture data, so data should be a pointer (GLubyte*), which converts implicitly to const GLvoid*.
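
A minimal sketch of what that looks like, reusing the wid/hei/nSize values from the earlier snippet (the cast assumes nv::Image::getLevel() returns a void pointer, as in the SDK's nvImage.h):

const GLubyte* data = (const GLubyte*)img.getLevel(0);   // pointer to the compressed texture bytes, not a single GLubyte
glCompressedTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_S3TC_DXT1_EXT, wid, hei, 0, nSize, data);

A pointer converts implicitly to const GLvoid*, which is why the C2664 error goes away, whereas a lone GLubyte value does not convert.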