Correct glTexImage2D usage - ATI bug?



sqrt[-1]
02-26-2003, 01:14 AM
What is the correct result from passing GL_RGB (or 3) as the internalFormat parameter in glTexImage2D?

I pass in 8-bit RGB data (with GL_RGB as the internal format) when generating a normalization cube-map and the resulting cube map has horrible banding artifacts. (see the ATI demo at http://www.ati.com/developer/sdk/RadeonSDK/Html/Samples/OpenGL/RADEONCombine3SpecMap.html for a similar artifact)

Now if I pass GL_RGB8 as the internal format in glTexImage2D, the banding disappears.

(In the ATI demo, change the six glTexImage2D calls around line 938 of RadeonCombiner3SpecMap.cpp from
glTexImage2D(...., 0, 3,....) to
glTexImage2D(...., 0, GL_RGB8,....)

and the banding artifacts disappear as well; see the sketch below.)
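
In other words, the upload ends up looking something like this. It's just a rough sketch; "size" and "faceData" are placeholder names for the cube-map face resolution and the 8-bit RGB data generated for each face:

/* Rough sketch: upload all six faces with an explicit sized internal
   format so the driver can't silently drop to 16 bits per texel.
   "size" and "faceData" are placeholders. */
for (int face = 0; face < 6; ++face)
{
    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X_ARB + face, 0,
                 GL_RGB8,                  /* sized format, not GL_RGB or 3 */
                 size, size, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, faceData[face]);
}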

Now I thought that when you specify GL_RGB (or 3), the driver is supposed to select the most appropriate internal format for the data passed in (as long as it has R, G and B components).

So is this an ATI driver bug, or am I reading the spec incorrectly?

Ysaneya
02-26-2003, 01:37 AM
Remember, it's not a bug... it's a "feature".

Seriously, I don't see where the problem is. When you specify GL_RGB or 3 as the internal format, you basically tell the driver "I don't care". And when the driver doesn't care, I'm guessing it uses 16-bit textures to reduce texture memory usage and increase performance. I might be wrong, but I think it's the same on NVidia's.

Y.

kehziah
02-26-2003, 04:57 AM
I guess that's where the wonderful display properties panel comes in: the "texture quality" slider could control how the driver stores textures that are specified with a loose format (i.e. a format that does not specify a bit count per channel).

V-man
02-26-2003, 06:35 AM
It might be possible to query the internal format:

glGetTexLevelParameterfv(bla bla bla)

Not sure if it returns the actual internal format, but try it.
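
Something along these lines should work, assuming the texture is bound to GL_TEXTURE_2D (for a cube map you'd query one of the face targets instead):

GLint fmt;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
/* fmt should come back as a sized enum such as GL_RGB8 or GL_RGB5,
   though some drivers may just echo the generic format that was requested. */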

zed
02-26-2003, 11:26 AM
Perhaps they're using R5G6B5 (or RGB5) textures when you call it with RGB.
This is correct behaviour! (The internal format only has to be close.)
NVidia cards will also do the same if you have a 16-bit window, i.e. give you RGB5 if you ask for RGB.
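
One way to check what you actually got is to read back the per-channel bit counts for level 0. A rough sketch, again assuming the texture is bound to GL_TEXTURE_2D (use a cube-map face target for a cube map):

GLint r, g, b;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &r);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &g);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &b);
/* 5/6/5 or 5/5/5 here would confirm the driver picked a 16-bit format. */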

sqrt[-1]
02-26-2003, 02:14 PM
OK, I'm convinced; I'll specify the bit precision of the internal formats I need from now on.
(It just seemed strange that ATI has a demo that does not work correctly with default settings. Guess I'm just letting off steam, as this "bug" took forever to track down.)