ATI, blending and pbuffer

Hi,

After switching to an ATI card, I ran into a strange problem. When rendering into a pbuffer, enabling blending causes corrupt texture coordinates. The ramp texture I’m using produces dense vertical lines. This is particularly strange because I have GL_CLAMP_TO_EDGE enabled for the texture. If I disable blending or render to the backbuffer instead of a pbuffer, the problem vanishes. I’ve tried disabling texgen (not that it was enabled to begin with), changing the texture dimensions and format, and using both immediate and retained mode, and nothing helps. If I change the contents of the texture, the results change, though.

Specs:
-Radeon 9600, Catalyst 4.7
-1024x1024 pbuffer
-glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
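
For reference, the relevant state setup is roughly this (simplified sketch; rampTex is just a placeholder for my texture object, and the actual upload and geometry are left out):

// ramp texture setup (rampTex is a placeholder name)
glBindTexture(GL_TEXTURE_2D, rampTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glEnable(GL_TEXTURE_2D);
// blending as listed above
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
// ... draw the textured geometry into the pbuffer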

On several NVidia boards everything has been fine. I haven’t tested it on other ATI boards or drivers. Any ideas?

-Ilkka

Does the ramp texture have an explicit alpha channel?

Yes, it’s a GL_ALPHA texture.

Is it GL_ALPHA or GL_ALPHA8? Have you tried changing the texture format to something with both color and alpha to see if it affects it in any way?

Hi,

I’ve tried ALPHA, ALPHA8 and RGBA8, and the format doesn’t seem to make any difference.
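
For what it’s worth, the variants boil down to these calls (sketch; w, h, rampA and rampRGBA stand in for the real dimensions and data):

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA,  w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, rampA);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA8, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, rampA);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8,  w, h, 0, GL_RGBA,  GL_UNSIGNED_BYTE, rampRGBA);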

I wrote a small test app to demonstrate the problem. It draws something to the back buffer/pbuffer and copies it to the screen. The results should look identical, but on my computer they don’t when blending is enabled. Here is a picture of what I’m getting.
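
The structure of the test is roughly this (sketch; drawScene and drawFullscreenQuad are placeholder helpers, and the two contexts share textures via wglShareLists):

// render the blended ramp into the pbuffer
wglMakeCurrent(pbufferDC, pbufferRC);
drawScene();
// grab the pbuffer contents into a shared texture
glBindTexture(GL_TEXTURE_2D, resultTex);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 1024, 1024);
// show it in the window
wglMakeCurrent(windowDC, windowRC);
drawFullscreenQuad(resultTex);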

-Ilkka

I took a look at this, but the code is in Delphi, and I don’t have any Delphi compiler available. You don’t happen to have any C/C++ code showing the problem?

No, sorry. Thanks for looking at it, though.

I might get a C compiler and try to wrap up a C version if I find some time, but it’s not likely to happen anytime soon. It would take a fraction of that time for somebody else to write it; after all, it’s a very simple program. That is, if you agree that it looks like a driver problem, not mine.

-Ilkka

Well, I quickly threw together an app doing something similar to yours, but couldn’t reproduce the problem at first. Then I changed the format of the pBuffer from RGBA8 to RGBA16, and the problem appeared. Judging from your code, it seems you intended to create an RGBA8 pBuffer, right? Have you checked that you’re getting the right format? I see there’s a loop in there that I’m not so sure about. You may want to verify that it works properly.
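
Something along these lines will tell you what you actually got (sketch, no error checking; hDC and format stand in for your device context and the pixel format your loop picked):

int attribs[] = { WGL_RED_BITS_ARB, WGL_GREEN_BITS_ARB, WGL_BLUE_BITS_ARB, WGL_ALPHA_BITS_ARB };
int bits[4];
wglGetPixelFormatAttribivARB(hDC, format, 0, 4, attribs, bits);
// for RGBA8 this should come back as 8, 8, 8, 8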

Still, RGBA16 blending seems broken in some strange way. Now that I think of it, it looks very similar to another bug we had not too long ago, but that one has been fixed. Back then it was pBuffer + blending + AUX buffers that was the problem, but the output looked very similar to what we’re getting here. Anyway, I’ll investigate this further tomorrow.

You’re right, I was getting an RGBA16 buffer. The loop only makes sure the pbuffer gets properly generated (probably unnecessary); it doesn’t check the format.

But: I can’t get an RGBA8 buffer, no matter what I try. I’ve tried specifying color bits and separate red, green and blue bits, and disabling all possible options, only leaving:
WGL_DRAW_TO_WINDOW_ARB, GL_FALSE,
WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,
WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB

I check all available modes, and they’re all either 16 or 32 bits per component. Using one of the 32-bit modes removes the problem, but causes a huge performance drop when blending is enabled.
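
The check itself is roughly this (sketch; attribs is the list above, 0-terminated):

int formats[256];
UINT count = 0;
wglChoosePixelFormatARB(hDC, attribs, NULL, 256, formats, &count);
for (UINT i = 0; i < count; i++) {
    int query[] = { WGL_RED_BITS_ARB, WGL_ALPHA_BITS_ARB };
    int bits[2];
    wglGetPixelFormatAttribivARB(hDC, formats[i], 0, 2, query, bits);
    // the bits always come back as 16 or 32 per component, never 8
}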

-Ilkka

Try removing WGL_DRAW_TO_WINDOW_ARB, GL_FALSE.

There is really no reason to require a format that cannot be used with a window.
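
I.e. something like this should get you an 8-bit format (sketch):

int attribs[] = {
    // WGL_DRAW_TO_WINDOW_ARB, GL_FALSE removed
    WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
    WGL_RED_BITS_ARB, 8,
    WGL_GREEN_BITS_ARB, 8,
    WGL_BLUE_BITS_ARB, 8,
    WGL_ALPHA_BITS_ARB, 8,
    0
};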

Right… That does it. Thanks.

-Ilkka

Originally posted by JustHanging:
Using one of the 32-bit modes removes the problem, but causes a huge performance drop when blending is enabled.
That’s because floating-point blending is not supported in hardware, so it falls back to software.