Originally posted by jwatte:
I believe the problem is one in TexEnv or blending modes, rather than the texture format. TexEnv settings and blending modes are not shared across contexts; TexParameter and TexImage are.
If I have ALPHA_TEST active I get holes in my textured polygon, and the holes match the locations of the black portions of the texture. This happens inside a single large polygon, so the only source I can imagine is something texture-related generating "alpha information". I am not using texture-environment or blending-related extensions (nor fragment programs), so I assume my RGB values do not affect alpha.
[b]
Note that it’s extremely unlikely that RGB DXT1 would turn into RGBA. Internally, RGB and RGBA use THE SAME compressed format. The de-compressor looks at the two palette entries for the block: if the first is numerically greater than the second, the four values for each pixel in the block consist of the two colors and interpolated colors 1/3 and 2/3 between them. If the first is less than or equal to the second, the four values consist of the two colors, their average, and a black/transparent color. The fact that you specify RGB or RGBA is more an attempt to make this special case fit into the OpenGL model; the hardware doesn’t care.
And, specifically, it is not true that black is always transparent in DXT1; however, it IS true that transparent is always black in DXT1.[/b]
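The mode selection described above can be sketched in code. This is a minimal illustration, not real decoder source; the function name and parameter layout are my own, and the raw 16-bit RGB565 endpoint values are passed alongside their decoded (r, g, b) forms because the numeric comparison is done on the raw values:

```python
def dxt1_palette(c0_raw, c1_raw, c0, c1, opaque_format):
    """Derive the four palette entries of one DXT1 block.

    c0_raw / c1_raw: raw 16-bit RGB565 endpoints (compared numerically)
    c0 / c1: the same endpoints decoded to (r, g, b) tuples, 0..255
    opaque_format: True for RGB_DXT1, where alpha is 255 everywhere
    Returns four (r, g, b, a) tuples, one per 2-bit pixel index.
    """
    def mix(a, b, wa, wb, div):
        # Weighted blend of two decoded endpoint colors.
        return tuple((wa * x + wb * y) // div for x, y in zip(a, b))

    if c0_raw > c1_raw:
        # Four-color mode: the endpoints plus 1/3 and 2/3 interpolants.
        return [c0 + (255,), c1 + (255,),
                mix(c0, c1, 2, 1, 3) + (255,),
                mix(c0, c1, 1, 2, 3) + (255,)]
    else:
        # Three-color mode: endpoints, their average, and index 3 =
        # black, which is transparent under RGBA_DXT1 but opaque
        # black under RGB_DXT1.
        return [c0 + (255,), c1 + (255,),
                mix(c0, c1, 1, 1, 2) + (255,),
                (0, 0, 0, 255 if opaque_format else 0)]
```

This also makes jwatte's last point visible: only index 3 of three-color mode is ever transparent, and it is always black.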
(I have written my texture compression code myself, so I sort of know how it behaves, although I might have started to forget something…)
RGB_DXT1 has a specific bit pattern for black, so it is pretty certain my compressor uses it whenever it needs black. RGBA_DXT1 behaves exactly like RGB_DXT1, except that it sets alpha = 0 for that same bit pattern and alpha = 1 otherwise.
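That difference is exactly enough to produce the holes described above. A minimal sketch, assuming an alpha test along the lines of glAlphaFunc(GL_GREATER, 0); the function names are illustrative, and alpha is kept as 0/255 rather than GL's normalized [0, 1] reference range:

```python
def dxt1_black_texel_alpha(treated_as_rgba):
    """Alpha produced for DXT1's 'black' bit pattern (three-color
    mode, pixel index 3) under each format interpretation."""
    return 0 if treated_as_rgba else 255

def alpha_test_passes(alpha, ref=0):
    """Simulates an alpha test with GL_GREATER: the fragment is
    kept only if its alpha exceeds the reference value."""
    return alpha > ref

# Uploaded and sampled as RGB_DXT1: black texels have alpha 255,
# so the fragments survive the alpha test.
assert alpha_test_passes(dxt1_black_texel_alpha(False))

# Misinterpreted as RGBA_DXT1: the same texels get alpha 0 and are
# discarded, punching holes exactly where the texture is black.
assert not alpha_test_passes(dxt1_black_texel_alpha(True))
```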
So I still think the symptoms look as if the hardware, for some reason, treats my RGB_DXT1 texture as an RGBA_DXT1 texture. Because the formats are so similar, the effect is no more chaotic than this. Also because of that similarity, I wouldn't be surprised if the distinction were implemented with a single configuration bit somewhere; maybe something goes wrong with handling that bit when textures are shared across rendering contexts.
(BTW, contrary to what you write, the hardware must care about the difference, because the alpha value coming out of the texture stage should differ between these two formats. And obviously it does, except in my PBuffer case, which is my problem.)
Of course it is possible that my problem is caused by a bad pointer overwriting memory used by the NVIDIA driver; I can only be sure of that once I have made a suitably small test case.
Eero
[This message has been edited by epajarre (edited 08-03-2003).]