TGA Texture Problem

Hi guys,
I have written a TGA loader and am trying to get textures to load from it. I am entirely confident that the loader itself is working correctly; I have verified the data byte for byte.
The problem is that when I load a texture from an image I get very strange results. It looks as though the data is off by one byte per row of pixels. Here is an image to show you what I mean. The top is what should display; below the white line is what DOES display. (The white line is there for your viewing pleasure; it isn't part of either image.)

The code I use to change my image data into a texture is the following function:

GLuint texFromTarga(targa tga)
{
    GLuint r;

    if (!tga.safeChk)
        return 0;

    glGenTextures(1, &r);
    glBindTexture(GL_TEXTURE_2D, r);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);

    gluBuild2DMipmaps(GL_TEXTURE_2D, tga.chn, tga.w, tga.h, tga.pxTyp, GL_UNSIGNED_BYTE, tga.imgDat);

    free(tga.imgDat);

    return r;
}

EDIT: tga.safeChk is a bool that is set to false if there is some error loading the file.

tga.chn holds (bpp / 8), meaning the number of bytes per pixel.

tga.w is the width of the image.
tga.h is the height of the image.
tga.pxTyp is GL_BGR for 24-bit images and GL_BGRA for 32-bit images. (I have tried GL_RGB and GL_RGBA; it makes no difference, except that blue and red are swapped, obviously.)
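
Putting those together, the relevant parts of the struct look roughly like this (field names as above; the exact types are just a sketch):

    typedef struct
    {
        bool     safeChk; /* false if the file failed to load                  */
        int      chn;     /* bytes per pixel (bpp / 8): 3 or 4                 */
        int      w, h;    /* image width and height in pixels                  */
        GLenum   pxTyp;   /* GL_BGR for 24-bit images, GL_BGRA for 32-bit ones */
        GLubyte *imgDat;  /* decoded pixel data, w * h * chn bytes             */
    } targa;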

PLEASE don't tell me to google a good TGA loader or link me to NeHe. I know my loader works, and I am pretty confident that the problem is some poorly set parameter in my OpenGL code, which I would like to A: find, and B: UNDERSTAND. I am guessing it's in the above function, but I am pretty lost and it could be anywhere.

Another important bit I forgot to mention: this works just fine with 32-bit images (i.e. ones with an alpha channel); it only fails with the 24-bit files. Just so that we're clear, the rainbow effect only happens on a per-row basis, meaning that I am NOT reading an incorrect number of bytes per pixel; the error only appears at the end of each row (by all appearances). If I were reading BGR as BGRA I would be getting fancy rainbows and transparencies on every pixel.

Thank you very much in advance to anyone who has any idea what’s going on here.

OK, nobody stab me for this, but if I make my textures with power-of-two dimensions they don't do this. I was under the impression that this didn't matter with gluBuild2DMipmaps, and I'm also highly confused as to how it could have caused my problem.
Am I actually solving this by resizing my textures, or am I coincidentally hiding some deeper problem?

glPixelStorei(GL_PACK_ALIGNMENT,1);
glPixelStorei(GL_UNPACK_ALIGNMENT,1);
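
The reason it only bites your 24-bit, non-power-of-two images: GL_UNPACK_ALIGNMENT defaults to 4, so GL expects every row of the data you hand to gluBuild2DMipmaps (or glTexImage2D) to start on a 4-byte boundary. With 32-bit pixels a row is always a multiple of 4 bytes, but a tightly packed 24-bit row usually isn't, so GL skips a byte or two at the end of each row and the whole image shears sideways; your power-of-two widths just happen to give row sizes that are already multiples of 4. A rough sketch of the padding GL assumes (variable names here are only for illustration):

    int alignment  = 4;                          /* default GL_UNPACK_ALIGNMENT     */
    int rowBytes   = width * bytesPerPixel;      /* what a tightly packed row holds */
    int glRowBytes = (rowBytes + alignment - 1) & ~(alignment - 1); /* rounded up   */
    /* e.g. a 24-bit image 30 pixels wide: rowBytes = 90, glRowBytes = 92,
       so GL reads 2 extra bytes per row unless the alignment is set to 1. */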

Try to avoid gluBuild2DMipmaps, as it will internally resize and smudge the NPOT texture to POT. You can make the GPU create mipmaps for you in two ways: via the deprecated GL_GENERATE_MIPMAP texture parameter, or via the newer glGenerateMipmap() call.
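
A minimal sketch of the second route, reusing the field names from your post (assumes a context that has glGenerateMipmap and non-power-of-two texture support):

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);          /* rows are tightly packed    */
    glTexImage2D(GL_TEXTURE_2D, 0,
                 tga.chn == 4 ? GL_RGBA8 : GL_RGB8, /* internal format            */
                 tga.w, tga.h, 0,
                 tga.pxTyp, GL_UNSIGNED_BYTE, tga.imgDat);
    glGenerateMipmap(GL_TEXTURE_2D);                /* replaces gluBuild2DMipmaps */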

P.S. BTW, there's a general requirement in Win32/BMP that scanlines of image data start at a DWORD-aligned address.
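
So if you ever load BMPs, each row is padded up to the next multiple of 4 bytes; the usual stride formula is (names here are only illustrative):

    int bmpStride = ((width * bitsPerPixel + 31) / 32) * 4; /* row length including padding */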

That effect can only happen if you were losing a byte or two per line, so it's in your loading code, probably in how you load the last few pixels of each line. Do go over your code several times.
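
If it does turn out to be the loader, reading the file row by row with an explicit byte count makes it hard to lose anything at the end of a line. A rough sketch, where file, width, height and bytesPerPixel stand in for whatever your loader actually uses:

    /* Read exactly width * bytesPerPixel bytes for every row, no more, no less. */
    size_t rowBytes = (size_t)width * bytesPerPixel;
    unsigned char *pixels = malloc(rowBytes * height);
    for (int y = 0; y < height; ++y)
        if (fread(pixels + (size_t)y * rowBytes, 1, rowBytes, file) != rowBytes)
            break;  /* short read: treat it as an error instead of shifting later rows */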

I actually had pretty much the same problem; in my case it was a bug in the loop that counted the pixels on each line.