glTexImage2D crashes with SIGSEGV

Hi,

I tried to texture a cube using FreeImage. However, each time I call glTexImage2D the program crashes.

I (up)load the texture with the following code:



    GLuint TextureHandle = 0;
    FIBITMAP* Image = 0;
    int ImageWidth = 0;
    int ImageHeight = 0;
    FIBITMAP* tmp;
    FREE_IMAGE_FORMAT Format;
    const char *File = "~/Bilder/lavaPathTraceTest.png";

    FreeImage_Initialise();

    Format = FreeImage_GetFileType(File, 0);
    tmp = FreeImage_Load(Format, File);

    Image = FreeImage_ConvertTo32Bits(tmp);
    FreeImage_Unload(tmp);

    ImageWidth = FreeImage_GetWidth(Image);
    ImageHeight = FreeImage_GetHeight(Image);

    if(TextureHandle == 0){
        glGenTextures(1, &TextureHandle);
        checkOpenGLErrors();
    }

    //this works perfectly
    //FreeImage_Save(Format, Image, "/home/alex/Bilder/FreeImageSave", 0);

    glBindTexture(GL_TEXTURE_2D, TextureHandle);
    checkOpenGLErrors();
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, getTextureFilter(MagFilter));
    checkOpenGLErrors();
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, getTextureFilter(MinFilter));
    checkOpenGLErrors();
    //this crashes with SIGSEGV
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ImageWidth, ImageHeight,
                 0, GL_BGRA, GL_UNSIGNED_BYTE, (GLvoid *) Image);
    checkOpenGLErrors();

Saving the image works perfectly, and none of the checkOpenGLErrors calls reports an error.

The texture is a 4096x4096 PNG image.
glGetIntegerv(GL_MAX_TEXTURE_SIZE) reports 16384.
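
For completeness, that limit was queried along these lines (a minimal sketch; glGetIntegerv writes the result through a pointer):

    GLint MaxTextureSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &MaxTextureSize);
    // MaxTextureSize is 16384 here, so 4096x4096 is well within the limit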

I’m using an OpenGL 4.0 forward-compatible context with a GTS 450 on Ubuntu 12.10.

Thanks in advance,
Apoptose

glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,ImageWidth,ImageHeight,
                 0,GL_BGRA,GL_UNSIGNED_BYTE,(GLvoid *) Image);

You are passing a pointer to the FreeImage bitmap object (FIBITMAP*), not to the actual pixel data. You should use FreeImage_GetBits to get a pointer to the raw pixel data.
See this example: image loading - OpenGL: Basic Coding - Khronos Forums
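
Something along these lines, as a minimal sketch (the Pixels variable is just a name for illustration):

    BYTE* Pixels = FreeImage_GetBits(Image);   // raw pixel data of the 32-bit image, not the FIBITMAP wrapper
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, ImageWidth, ImageHeight,
                 0, GL_BGRA, GL_UNSIGNED_BYTE, (GLvoid*) Pixels);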

P.S.:

GLuint TextureHandle = 0;

I’d replace it with

GLuint TextureHandle = -1;

and

if(TextureHandle == 0)

with

if(TextureHandle == -1)

because glGenTextures can potentially generate a texture object with id 0.

Thanks Nowhere-01,

I actually used that example, but didn’t notice the FreeImage_GetBits call.

And thanks for that tip! (Although TextureHandle then needs to be a GLint)

Apoptose

OK, I screwed up a bit: glGenTextures returns 0 in case something has gone wrong. But I still wouldn’t use it as an initialization indicator. And you can set unsigned variables to -1; it will actually be the maximum unsigned int value, but comparing it to -1 is correct.

OK, then I’ll leave it as 0.

I actually worked around comparing to -1 by using TextureHandle < 0, since GCC didn’t let me do it :smiley:

That is not correct. You can assign a negative value to an unsigned int and compare it to that exact value.

But:

comparison of unsigned expression < 0 is always false

– g++ compiler


unsigned f = -1;
if(f == -1) {
    //true
}

if(f < 0) {
    //always false
}

if(f >= 0) {
    //always true
}


[QUOTE=Nowhere-01;1248005]That is not correct. You can assign a negative value to an unsigned int and compare it to that exact value.
[/QUOTE]

Just tried it. With f == -1, g++ complained with a -Wsign-compare warning.

After adding a cout statement I realized that f < 0 was false. I just thought it was true since the texturing code worked :slight_smile:

Yes, it gives you a warning, but it is well-defined behavior:
http://liveworkspace.org/code/2GRZ6E$3

But if you don’t want warnings or find using -1 uncomfortable, you can create a separate variable to indicate whether the texture object has been initialized. Or you can use 0, but that’s not reliable.
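
A minimal sketch of that separate-flag idea (the flag name is just an illustration):

    GLuint TextureHandle = 0;
    bool TextureCreated = false;   // explicit flag instead of overloading the handle value

    if(!TextureCreated){
        glGenTextures(1, &TextureHandle);
        TextureCreated = true;
    }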