DevIL problems.

Hello, I have a new problem :S
I use DevIL to load my images and then manually get the image data (ilGetData) to create my texture with glGenTextures and glTexImage2D. The image and texture are created OK, but when I try to glBindTexture with the texture ID of the class, nothing happens. I use the lesson 5 code from NeHe to make a test quad and use glTexCoord2f as described there. I do enable GL_TEXTURE_2D with glEnable. The image I am loading is a 128x128 BMP. (BTW, the program crashes when I run it normally, but in Debug it doesn't crash at all; weird??)

All I see is part of the texture in the corners of the cube, like small dots in every corner. Any ideas??
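For reference, here is roughly the loading path I described, as a trimmed sketch (not my exact code; ilInit() is assumed to have been called once at startup, and the helper name and filename are placeholders):

[code]
#include <IL/il.h>
#include <GL/gl.h>

/* Sketch of the DevIL -> OpenGL path: load with DevIL, upload with GL. */
GLuint LoadTexture(const char *filename)
{
    ILuint image;
    GLuint textureID;

    ilGenImages(1, &image);
    ilBindImage(image);
    if (!ilLoadImage(filename)) {              /* load the BMP with DevIL */
        ilDeleteImages(1, &image);
        return 0;
    }
    ilConvertImage(IL_RGBA, IL_UNSIGNED_BYTE); /* normalize the pixel layout */

    glGenTextures(1, &textureID);
    glBindTexture(GL_TEXTURE_2D, textureID);

    /* Without these, the default min filter expects mipmaps, the texture
       is incomplete, and binding it appears to do nothing. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 ilGetInteger(IL_IMAGE_WIDTH),
                 ilGetInteger(IL_IMAGE_HEIGHT),
                 0, GL_RGBA, GL_UNSIGNED_BYTE, ilGetData());

    ilDeleteImages(1, &image);                 /* GL owns a copy now */
    return textureID;
}
[/code]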

Nm, I just found it!!!

Bah, I thought I had found everything, but there is a bug here. When I load non-power-of-2 BMP images, my application crashes. The thing is, the MD2 model supplied with the MD2-animation sample at GameTutorials.com comes with a non-power-of-2 BMP, and that works! Could it be the DevIL library? Does OpenGL only accept power-of-2 textures?

Thanks

OpenGL requires texture dimensions to be powers of 2; however, I have never heard of a program crashing when it attempts to use something else.
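If the dimensions are the problem, a quick sanity check before the upload would catch it. Just a sketch; the helper name is arbitrary:

[code]
#include <assert.h>

/* True if n is a power of two (and non-zero). */
static int IsPowerOfTwo(unsigned int n)
{
    return n != 0 && (n & (n - 1)) == 0;
}

/* e.g. before calling glTexImage2D:
   assert(IsPowerOfTwo(width) && IsPowerOfTwo(height)); */
[/code]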

Originally posted by kaysoft:
[b]Bah, I thought I had found everything, but there is a bug here. When I load non-power-of-2 BMP images, my application crashes. The thing is, the MD2 model supplied with the MD2-animation sample at GameTutorials.com comes with a non-power-of-2 BMP, and that works! Could it be the DevIL library? Does OpenGL only accept power-of-2 textures?

Thanks[/b]

I don't know why DevIL crashes; it can read any image size AFAIK. Too bad it is no longer supported by the author.

OpenGL can take non-power-of-2 textures, but only with the NV_texture_rectangle extension.

It's always better to scale it to a power of 2 for standard GL.
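Something along these lines with ILU would do it. A rough sketch only, assuming iluInit() has been called once at startup and the image to scale is currently bound; the helper names are mine:

[code]
#include <IL/il.h>
#include <IL/ilu.h>

/* Round up to the next power of two. */
static ILuint NextPowerOfTwo(ILuint n)
{
    ILuint p = 1;
    while (p < n) p <<= 1;
    return p;
}

/* Call after ilLoadImage() on the currently bound image. */
static void ScaleToPowerOfTwo(void)
{
    ILuint w  = ilGetInteger(IL_IMAGE_WIDTH);
    ILuint h  = ilGetInteger(IL_IMAGE_HEIGHT);
    ILuint pw = NextPowerOfTwo(w);
    ILuint ph = NextPowerOfTwo(h);

    if (pw != w || ph != h)
        iluScale(pw, ph, 1);  /* depth 1 for a 2D image */
}
[/code]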

V-man

OK, thanks guys.
But what if the user doesn't have an NVIDIA card? Is there a way to support non-power-of-2 texture sizes with ARB extensions?
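For now I am checking the extension string and falling back to scaling. A rough sketch (simple substring test, needs a current GL context; the helper names are mine):

[code]
#include <GL/gl.h>
#include <string.h>

/* Nonzero if the named extension appears in the GL extension string. */
static int HasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

/* Decide the texture path once at startup: rectangle textures if the
   NVIDIA extension is there, otherwise scale everything to powers of 2. */
static int CanUseRectangleTextures(void)
{
    return HasExtension("GL_NV_texture_rectangle");
}
[/code]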