Texture not working

A 32x32 pixel PNG that I’m loading through LodePNG is either not being loaded into a texture correctly, or something else in my code is keeping it from being applied to anything.

I’ve checked the image data before sending it to OpenGL and it looks valid: the decoder doesn’t return any errors, and I see a bunch of 0-255 values. I never get anything besides GL_NO_ERROR from glGetError() at runtime.
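
In case the way I’m checking matters, it’s roughly this (just a sketch; the real code calls glGetError() inline after the texture calls, and println/toString are my own helpers):

void checkGLError(string where)
{
	// Drain every pending error; GL keeps queuing them until they are read.
	GLenum err;
	while((err = glGetError()) != GL_NO_ERROR)
		println("GL error " + toString((int)err) + " at " + where);
}

// e.g. checkGLError("after glTexImage2D");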

Here’s my image loading and binding function. Texture is a class that has an ID (uint), width and height.
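
The declaration is roughly this (just a sketch of what I described, not the full header):

class Texture
{
public:
	Texture(string pngPath);	// decodes the PNG and uploads it as an OpenGL texture
	void bind();				// binds the texture before drawing

	unsigned int ID;			// texture handle from glGenTextures
	int width, height;			// image dimensions in pixels
};

And the constructor and bind() look like this: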

Texture::Texture(string pngPath)
{
	string path = getWorkDir() + pngPath;
	const char* filename = path.c_str();
	println("Loading '" + path + "'...");

	std::vector< unsigned char > rawImage;
	LodePNG::loadFile( rawImage, filename );

	LodePNG::Decoder decoder;
	std::vector< unsigned char > image;
	decoder.decode( image, rawImage.empty() ? 0 : &rawImage[0], (unsigned)rawImage.size() );

	if(decoder.hasError()) 
        println("PNGDecoder: " + toString(decoder.getError()) + ": " + LodePNG_error_text(decoder.getError()));
	
	//
	// Flip the image vertically: LodePNG decodes rows top-to-bottom, but
	// OpenGL expects the first row of texture data to be the bottom row.
	//

	unsigned char *imagePtr = &image[0];
	int halfTheHeightInPixels = decoder.getHeight() / 2;
	int heightInPixels = decoder.getHeight();

	// Assuming RGBA for 4 components per pixel.
	int numColorComponents = 4;

	// Assuming each color component is an unsigned char.
	int widthInChars = decoder.getWidth() * numColorComponents;

	unsigned char *top = NULL;
	unsigned char *bottom = NULL;
	unsigned char temp = 0;

	for( int h = 0; h < halfTheHeightInPixels; ++h )
	{
		top = imagePtr + h * widthInChars;
		bottom = imagePtr + (heightInPixels - h - 1) * widthInChars;

		for( int w = 0; w < widthInChars; ++w )
		{
			// Swap the chars around.
			temp = *top;
			*top = *bottom;
			*bottom = temp;

			++top;
			++bottom;
		}
	}
	
	//
	// Create the OpenGL texture and fill it with our PNG image.
	//
	width = decoder.getWidth();
	height = decoder.getHeight();
	// Allocates one texture handle
	glGenTextures(1, &ID);

	// Binds this texture handle so we can load the data into it
	glBindTexture(GL_TEXTURE_2D, ID);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, &image[0]);
	
	println("loaded texture with id " + toString(ID) + " width: " + toString(width) + " height: " + toString(height));
}

void Texture::bind()
{
	glBindTexture(GL_TEXTURE_2D, ID);
}

And for displaying the image, I use this function. You can ignore the u/vScale vars (those aren’t doing anything yet).

void drawImage(int x, int y, Texture image)
{
	float uScale = 1;
	float vScale = 1;
	int w = image.width;
	int h = image.height;
		
	glEnable(GL_TEXTURE_2D);
		
	glColor3f(1.0f,1.0f,1.0f); // draw with no tint (white)
	image.bind();

	glBegin(GL_QUADS);
		glTexCoord2f(0,0);
		glVertex2i(x, y);
			
		glTexCoord2f(1*uScale,0);
		glVertex2i(x + w, y);
			
		glTexCoord2f(1*uScale,1*vScale);
		glVertex2i(x + w, y + h);
			
		glTexCoord2f(0,1*vScale);
		glVertex2i(x, y + h);
	glEnd();
		
	glDisable(GL_TEXTURE_2D);
}
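
The vertices are in pixel coordinates, so this relies on a 2D orthographic projection being set up elsewhere in my code, roughly along these lines (a sketch with placeholder names, not my actual resize code):

glMatrixMode(GL_PROJECTION);
glLoadIdentity();
// windowWidth/windowHeight stand in for the real window size
glOrtho(0, windowWidth, windowHeight, 0, -1, 1);	// top-left origin, y grows downward
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();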

If there isn’t anything wrong with this code, then is there some setting that would prevent textures from working (apart from GL_TEXTURE_2D being off, which it isn’t)? My program is pretty basic: all I do is draw a few primitives followed by this image, which shows up as a white rectangle.

EDIT: Also, here’s my OpenGL init function:

bool InitGL(GLvoid)                              // All Setup For OpenGL Goes Here
{
	glClearColor(0.0f, 0.0f, 0.0f, 0.0f);		// Black Background
	glDisable(GL_DEPTH_TEST);

	// enable alpha blending
	//glEnable(GL_BLEND);
	//glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

	// transparent images
	glAlphaFunc(GL_GREATER, 0.1f);
	glEnable(GL_ALPHA_TEST);

	glEnable(GL_TEXTURE_2D);
	glDisable(GL_LIGHTING);
	glDisable(GL_BLEND);
	glDisable(GL_COLOR_MATERIAL);
	return true;								// Initialization Went OK
}

Ok, so I think I’ve found the problem. Something is going wrong when I upload the pixels to OpenGL. I plugged the same image loading code into a program that has working textured polygons and I got white squares, just like in my program.

This doesn’t make any sense to me, though, since I know the image data I’m passing isn’t pure white or empty. Maybe there’s something wrong with using ‘&image[0]’ as the pointer to the image data?

I’ve read that RGBA is the only format LodePNG outputs, so the format should be correct. Width and height are set correctly. Is it possible that using unsigned chars instead of unsigned bytes would be a problem? They both hold values 0-255, so I doubt it.
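
In other words, if I swapped in a hard-coded RGBA pattern like this (just a sketch) and still got a white box, the decoded data couldn’t be the problem:

// Upload a hard-coded 2x2 RGBA checkerboard in place of the decoded PNG.
unsigned char testPixels[2 * 2 * 4] =
{
	255,   0,   0, 255,     0, 255,   0, 255,	// red,  green
	  0,   0, 255, 255,   255, 255, 255, 255	// blue, white
};

glBindTexture(GL_TEXTURE_2D, ID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 2, 2, 0, GL_RGBA, GL_UNSIGNED_BYTE, testPixels);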

Could you enable GL_TEXTURE_2D when you generate the texture in the Texture constructor?


// Allocates one texture handle
glGenTextures(1, &ID);
glEnable(GL_TEXTURE_2D); //enable texturing then bind the texture
// Binds this texture handle so we can load the data into it
glBindTexture(GL_TEXTURE_2D, ID);

GL_TEXTURE_2D is enabled in the init function when the program starts. I tried putting it there anyway and still got the white box.

I’m about to give up on this, use the SOIL library, and rewrite my program.

Oh wow, I had a feeling the answer would induce a facepalm. For some reason it comes down to whether I create the texture object with:

Texture tex("test.png");

or with:

Texture* tex = new Texture("test.png");

I’m not sure why this is, so I guess I should read up a little more on the basics of C++…