Nvidia texture compression + pbuffer bug?

I am experiencing a peculiar problem. I am using compressed textures in the RGB_S3TC_DXT1 format (code 0x83f0), and they work fine.

Now I want to use my scene as a texture and render it to a pbuffer. I activate texture sharing with my original rendering context and all is fine, except that my RGB_S3TC_DXT1 textures behave as if they were RGBA_S3TC_DXT1 textures, which makes the black areas in them alpha-transparent. I have called glGetTexLevelParameteriv(GL_TEXTURE_INTERNAL_FORMAT) and it returns the correct code, so that has not changed, but the textures still behave incorrectly.
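For reference, the check I do looks roughly like this (a minimal sketch; the #define comes from GL_EXT_texture_compression_s3tc, and the texture handle is just a placeholder for my shared texture object):

[code]
#include <windows.h>
#include <GL/gl.h>

#ifndef GL_COMPRESSED_RGB_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT 0x83F0  /* from GL_EXT_texture_compression_s3tc */
#endif

/* Returns nonzero if mip level 0 of 'tex' still reports the RGB DXT1
   internal format in the currently active context. */
static int IsRgbDxt1(GLuint tex)
{
    GLint fmt = 0;
    glBindTexture(GL_TEXTURE_2D, tex);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &fmt);
    return fmt == GL_COMPRESSED_RGB_S3TC_DXT1_EXT;
}
[/code]

This returns true in both contexts, yet the rendering differs.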

I am using a GeForce FX 5900 on Windows XP with the 44.03 drivers. I suspect that this problem did not exist on my GeForce (1) DDR, but I cannot be sure. (Fullscreen pbuffer / render-to-texture stuff didn’t really fly on that card…)

I can work around this problem, but I am wondering if anybody has seen this before?

Also, is the Nvidia Developer Relations email responsive (for a non-registered developer)? Making a test case might require some work, and I wouldn’t want to do that if they are not interested in stuff like this.

   Eero

I believe the problem is one in TexEnv or blending modes, rather than the texture format. TexEnv settings and blending modes are not shared across contexts; TexParameter and TexImage are.
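For example, something along these lines is needed (a rough sketch with hypothetical handle names; wglShareLists makes the texture objects visible in both contexts, but the per-context state has to be set up again in the pbuffer context):

[code]
#include <windows.h>
#include <GL/gl.h>

/* Hypothetical helper: make the main context's texture objects visible
   in the pbuffer context, then re-establish the per-context state there. */
static void SetupPbufferContext(HGLRC hglrcMain, HDC hdcPbuffer, HGLRC hglrcPbuffer)
{
    /* texture objects (TexImage / TexParameter state) become shared ... */
    wglShareLists(hglrcMain, hglrcPbuffer);

    /* ... but TexEnv, blending and alpha test are per-context, so set
       them again once the pbuffer context is current */
    wglMakeCurrent(hdcPbuffer, hglrcPbuffer);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glDisable(GL_BLEND);
    glDisable(GL_ALPHA_TEST);
}
[/code]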

Note that it’s extremely unlikely that RGB DXT1 would turn into RGBA. Internally, RGB and RGBA use THE SAME compressed format. The decompressor looks at the two palette entries for the block: if the first is numerically greater than the second, the four values available to the pixels in the block are the two colors and the interpolated colors 1/3 and 2/3 of the way between them; if the first is numerically smaller or equal, the four values are the two colors, their average, and a black/transparent color. The fact that you specify RGB or RGBA is more an attempt to make this special case fit into the OpenGL model; the hardware doesn’t care.

And, specifically, it is not true that black is always transparent in DXT1; however, it IS true that transparent is always black in DXT1.
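Here is a rough decoder-side sketch of that rule (my own reconstruction, not actual hardware or driver code); notice that the only place where the RGB/RGBA choice matters is the fourth palette entry of the three-color mode:

[code]
/* Rough sketch of how one DXT1 block's 4-entry palette is decoded.
   c0 and c1 are the two 5:6:5 palette entries stored in the block;
   pal receives four {r,g,b,a} entries.  asRgba selects the RGBA_DXT1
   interpretation, where the fourth entry of the three-color mode is
   transparent instead of opaque black. */
static void Dxt1Palette(unsigned short c0, unsigned short c1, int asRgba,
                        unsigned char pal[4][4])
{
    unsigned char ep[2][3];
    int i;

    /* expand both 5:6:5 endpoints to 8:8:8 */
    for (i = 0; i < 2; ++i) {
        unsigned short c = i ? c1 : c0;
        ep[i][0] = (unsigned char)(((c >> 11) & 31) * 255 / 31);
        ep[i][1] = (unsigned char)(((c >>  5) & 63) * 255 / 63);
        ep[i][2] = (unsigned char)(( c        & 31) * 255 / 31);
        pal[i][0] = ep[i][0]; pal[i][1] = ep[i][1]; pal[i][2] = ep[i][2];
        pal[i][3] = 255;
    }

    if (c0 > c1) {
        /* four-color mode: interpolate at 1/3 and 2/3, everything opaque */
        for (i = 0; i < 3; ++i) {
            pal[2][i] = (unsigned char)((2 * ep[0][i] + ep[1][i]) / 3);
            pal[3][i] = (unsigned char)((ep[0][i] + 2 * ep[1][i]) / 3);
        }
        pal[2][3] = pal[3][3] = 255;
    } else {
        /* three-color mode: average, plus the special fourth entry */
        for (i = 0; i < 3; ++i) {
            pal[2][i] = (unsigned char)((ep[0][i] + ep[1][i]) / 2);
            pal[3][i] = 0;                    /* black in both formats */
        }
        pal[2][3] = 255;
        pal[3][3] = asRgba ? 0 : 255;   /* the ONLY difference RGB vs RGBA */
    }
}
[/code]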

Originally posted by jwatte:
I believe the problem is one in TexEnv or blending modes, rather than the texture format. TexEnv settings and blending modes are not shared across contexts; TexParameter and TexImage are.

If I have ALPHA_TEST active, I get holes in my textured polygon. The holes match the locations of the black portions.

This happens inside a single large polygon, so the only cause I can imagine is something texture-related generating the “alpha information”. I am not using texture-environment or blending-related extensions (nor fragment programs), so I assume that my RGB does not affect alpha.
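The relevant state is nothing more exotic than something like this (the 0.5 threshold is just what I happen to use), so any texel that comes back with alpha = 0 punches a hole:

[code]
/* somewhere in the draw setup */
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);   /* fragments with alpha <= 0.5 are discarded */
[/code]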

[b]
Note that it’s extremely unlikely that RGB DXT1 would turn into RGBA. Internally, RGB and RGBA use THE SAME compressed format. The decompressor looks at the two palette entries for the block: if the first is numerically greater than the second, the four values available to the pixels in the block are the two colors and the interpolated colors 1/3 and 2/3 of the way between them; if the first is numerically smaller or equal, the four values are the two colors, their average, and a black/transparent color. The fact that you specify RGB or RGBA is more an attempt to make this special case fit into the OpenGL model; the hardware doesn’t care.

And, specifically, it is not true that black is always transparent in DXT1; however, it IS true that transparent is always black in DXT1.[/b]

(I have written my texture compression code myself, so I sort of know how it behaves, although I might have started to forget something…)

RGB_DXT1 has a specific bit pattern for black, so it is pretty certain my compressor uses it whenever it needs black. RGBA_DXT1 behaves exactly like RGB_DXT1, except that it sets alpha = 0 for that same bit pattern and alpha = 1 otherwise.
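Roughly, my compressor’s index selection for a block in that mode looks like this (a simplified sketch, with the decoded palette passed in, not my actual code):

[code]
/* Simplified sketch of my compressor's index choice for a block that
   is in three-color mode (endpoints ordered so that c0 <= c1).  pal
   holds the four decoded palette entries as {r,g,b}; entry 3 is the
   special black entry that RGBA_DXT1 additionally treats as alpha 0. */
static unsigned int PickIndex(const unsigned char texel[3],
                              const unsigned char pal[4][3])
{
    unsigned int best = 0, i;
    int bestErr = 3 * 255 * 255 + 1;

    for (i = 0; i < 4; ++i) {
        int dr = texel[0] - pal[i][0];
        int dg = texel[1] - pal[i][1];
        int db = texel[2] - pal[i][2];
        int err = dr * dr + dg * dg + db * db;
        if (err < bestErr) { bestErr = err; best = i; }
    }
    /* a pure black texel matches entry 3 exactly, so it normally lands
       there -- which is exactly what turns into a hole if the texture
       is suddenly treated as RGBA_DXT1 */
    return best;
}
[/code]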

So I still think the symptoms look like the hardware is, for some reason, treating my RGB_DXT1 texture as an RGBA_DXT1 texture; because of the similarity of the formats, the effect is limited to transparency rather than anything more chaotic. Also because of that similarity, I wouldn’t be surprised if the distinction were implemented with a single configuration bit somewhere, and maybe there is something wrong with handling that bit when sharing textures across rendering contexts.

(BTW, unlike what you write, the hardware must care about the difference, because the alpha value coming out of the texture stage should differ between these two formats. And it obviously does, except in my pbuffer case, which is my problem.)

Of course it is possible that my problem is caused by some bad pointer overwriting memory used by the Nvidia driver; I can only be sure of that once I have made a suitably small test case.

Eero
