non power of two textures

I seem to recall a few years back, when I had one of those ghetto 9600 SE cards, that it claimed OpenGL 2.1 support but then simply didn’t support non power of two textures (one of the core features). It didn’t even export the extension string for it. And using it resulted in a cataclysm.

Anyway, fast forward somewhat, and I am using a ghetto ATI X1950 Pro (latest drivers). It claims non power of two texture support, but if I use any non power of two texture above 512 in size it comes out either black or totally corrupted. When it comes out corrupted, performance also seems to be crippled.

For testing I am using a 511x501 texture. It’s RGB, so the rows are not aligned on a 4 byte boundary. I am using
glPixelStorei (GL_UNPACK_ALIGNMENT, 1);

then
glTexImage2D (GL_TEXTURE_2D, 0, GL_RGBA, imgWidth, imgHeight, 0, format, GL_UNSIGNED_BYTE, data);

This works fine.
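For completeness, the whole upload path looks something like this (a minimal sketch; the texture object creation and filter setup are my additions, and GL_RGB is assumed for the format parameter since the source data is 3 component):

/* Minimal sketch of the upload path described above. Assumes `data`
   holds tightly packed 8-bit RGB pixels, 511x501 in size. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* 511 * 3 = 1533 bytes per row, which is not a multiple of 4,
   so the default unpack alignment of 4 would skew every row. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, imgWidth, imgHeight, 0,
             GL_RGB, GL_UNSIGNED_BYTE, data);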

But if I try a larger texture … this happens

What am I doing wrong?

Maybe I should add that this problem doesn’t happen on my NVIDIA card.

Edit:
Sigh …
It looks like an ATI driver bug
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=252427

That’s the exact issue I have. 2 years on and it’s still not fixed? That’s f*cking ridiculous.

Is EXT_texture_rectangle supported by your card? Because you’re using rectangular textures, and ARB_texture_non_power_of_two and EXT_texture_rectangle are two different things.

If not, try square (non rectangular) NPOT textures instead and report back.
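For what it’s worth, on a GL 2.x context you can check what’s actually advertised by searching the extension string. A sketch:

/* Sketch: check whether an extension is advertised on a GL 2.x context.
   Note strstr() can false-positive on extensions whose name is a prefix
   of another; good enough for a quick test. */
#include <string.h>
#include <GL/gl.h>

int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

/* e.g.
   has_extension("GL_ARB_texture_non_power_of_two")  -> true NPOT support
   has_extension("GL_EXT_texture_rectangle")         -> rectangle targets */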

I need mipmapping, so texture_rectangle is a no go; otherwise I would have used it. The problem isn’t actually the non power of two part. I thought it was originally, but I found the same problem happens with 2048x2048 textures, even though the card reports a maximum texture size of something like 8192x8192.
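One defensive check worth trying here is a proxy texture, which asks the driver whether it would accept a given size/format before you upload anything (a sketch; proxy textures are core since GL 1.1, though a driver buggy enough to corrupt large textures may well lie here too):

/* Sketch: ask whether a 2048x2048 RGBA8 texture is actually supported,
   without allocating it. A reported width of 0 means "no". */
GLint w = 0;
glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, 2048, 2048, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &w);
if (w == 0) {
    /* the implementation rejects this size/format combination */
}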

It’s simply that there’s a whole generation of ATI cards that can’t use large textures because of a bug in their driver … quite amazing, really.

> It’s simply that there’s a whole generation of ATI cards that can’t use large textures because of a bug in their driver … quite amazing, really.

To be fair though, it is a really old generation. The x**** series is over 5 hardware generations (and about 5 years) old, and they stopped supporting it about a year ago.

The kind of bugs you’re experiencing with ATI’s OpenGL drivers, particularly from that generation (or before), are not really unusual. I’ve noticed missing entrypoints (hello glPointParameteri), lack of core features (hello blend squaring) and mysterious behaviour with non power of two textures on many occasions in the past. The best approach is to code as defensively as possible, assume absolutely nothing, and work on the basis that if you can’t do it in D3D then you shouldn’t even attempt it in OpenGL (same as for Intel, basically).

One thing you might try in your particular situation is to use an RGBA texture (or, preferably, BGRA) instead of RGB. 3 component textures are not really native to graphics hardware, and the driver needs to do extra work if either format or internalformat is specified as 3 component, so switching to 4 component may help, at least by potentially bypassing a buggy part of the driver. BGRA is preferred because it is fully native to most hardware. Also, you can drop your glPixelStorei call when you do this, since 4 component rows are always 4 byte aligned.
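Something like this (a sketch; `data4` stands for a hypothetical 4 component buffer, and GL_BGRA needs GL 1.2 or EXT_bgra, which any card of that era has):

/* Sketch: upload 4 component BGRA data with a sized internal format.
   Rows are width * 4 bytes, so the default unpack alignment of 4
   always works and no glPixelStorei call is needed. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, imgWidth, imgHeight, 0,
             GL_BGRA, GL_UNSIGNED_BYTE, data4);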

This sucks a little if you’re using an image library that gives you 3 component data. You’ll need to malloc a new buffer, expand your data to 4 components (possibly swapping R and B), and then free it after your glTexImage2D call, but if it does work I guess it’s the way to go.
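The expansion itself is trivial; a sketch (with `src` as the hypothetical 3 component buffer the library hands you):

/* Sketch: expand tightly packed RGB to BGRA, swapping R and B.
   Caller frees the returned buffer after glTexImage2D. */
#include <stdlib.h>

unsigned char *rgb_to_bgra(const unsigned char *src, int w, int h)
{
    unsigned char *dst = malloc((size_t)w * h * 4);
    if (!dst) return NULL;
    for (size_t i = 0; i < (size_t)w * h; ++i) {
        dst[i * 4 + 0] = src[i * 3 + 2]; /* B */
        dst[i * 4 + 1] = src[i * 3 + 1]; /* G */
        dst[i * 4 + 2] = src[i * 3 + 0]; /* R */
        dst[i * 4 + 3] = 255;            /* opaque alpha */
    }
    return dst;
}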

> This sucks a little if you’re using an image library that gives you 3 component data.

Yeah, libpng :eek:

I couldn’t even get
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);

to work on the card. All the mipmaps come out black. For much larger textures it’s noticeably slow generating them on the CPU.
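If the card exposes EXT_framebuffer_object, glGenerateMipmapEXT might be worth a try as an alternative (a sketch; no guarantee it behaves any better on these drivers):

/* Sketch: hardware mipmap generation via EXT_framebuffer_object,
   called after uploading the base level. */
glBindTexture(GL_TEXTURE_2D, tex);
glEnable(GL_TEXTURE_2D); /* some ATI drivers reportedly need this
                            enabled for glGenerateMipmapEXT to work */
glGenerateMipmapEXT(GL_TEXTURE_2D);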