Blends look different on various graphics adapters

Hello everybody.

I use glBlendFunc with many combinations of source and destination factors to create various additive and blending effects. For some reason, some of those blends look different on ATI and other adapters than on my NVIDIA adapter. Some blends come out black instead of white, and some come out in different colors.
Does anybody know why?
I’m not using extensions. I use glAlphaFunc, glColorMaterial, and textures with the GL_MODULATE texture environment.
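
For reference, here is a minimal sketch of the render state described above, assuming a fixed-function OpenGL 1.x setup; the alpha-test threshold and the color-material parameters are assumptions, not details from the post:

```c
#include <GL/gl.h>

/* Sketch of the state described above (alpha-test threshold and
   color-material parameters are assumptions). */
void setup_effect_state(GLuint texture)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);

    /* Texture colors are multiplied with the incoming fragment color. */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

    /* Alpha test: discard nearly transparent fragments. */
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.1f);

    /* Vertex colors drive the material's ambient and diffuse terms. */
    glEnable(GL_COLOR_MATERIAL);
    glColorMaterial(GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE);

    glEnable(GL_BLEND);
}
```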

Do you mean glBlendFunc?

Also, what about destination alpha? This can have a profound effect on your results if you rely on it, and it could well be the main factor in any variations you see (in theory :-)).

Yes, I mean glBlendFunc with various parameters for the source and destination factors.

These are the values I use for the source factor:
GL_ZERO, GL_ONE, GL_DST_COLOR, GL_ONE_MINUS_DST_COLOR, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA, GL_SRC_ALPHA_SATURATE.

And these are the values for the destination factor:
GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ONE_MINUS_SRC_COLOR, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA.

With some combinations of the parameters above, the result differs on ATI and Intel chips. Where my NVIDIA gives a white result, other chips produce a black result, for example. Transparency (by color or by alpha) also looks inverted, like a negative.

… I use ordinary 2D RGB and RGBA textures, 16- and 32-bit.
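
A minimal sketch of how such an effect system might pick its blend mode from the two factor lists; the table-driven lookup is my assumption, only the factor values come from the lists above:

```c
#include <GL/gl.h>

/* Source and destination factors as listed above. */
static const GLenum src_factors[] = {
    GL_ZERO, GL_ONE, GL_DST_COLOR, GL_ONE_MINUS_DST_COLOR,
    GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
    GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA, GL_SRC_ALPHA_SATURATE
};

static const GLenum dst_factors[] = {
    GL_ZERO, GL_ONE, GL_SRC_COLOR, GL_ONE_MINUS_SRC_COLOR,
    GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA,
    GL_DST_ALPHA, GL_ONE_MINUS_DST_ALPHA
};

/* Select one source/destination pair before drawing an effect. */
void set_blend_mode(int src_index, int dst_index)
{
    glEnable(GL_BLEND);
    glBlendFunc(src_factors[src_index], dst_factors[dst_index]);
}
```

Any pair that uses GL_DST_ALPHA or GL_ONE_MINUS_DST_ALPHA depends on what is stored in the framebuffer's alpha channel, which is where implementations can differ.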

Originally posted by dorbie:
Also what about destination alpha?
That would be where I would double-check first as well. Even if you request 0 bits of alpha, on one implementation you might have gotten an RGBA pixel format, while on the other you might have gotten an RGB pixel format. Check ALPHA_BITS. (One of my favorite enums, long ago one of my favorite cereals.)
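
A minimal sketch of that check with glGetIntegerv:

```c
#include <GL/gl.h>
#include <stdio.h>

/* Query how many destination-alpha bits the current context really has.
   With 0 bits, destination alpha effectively always reads as 1.0. */
void report_alpha_bits(void)
{
    GLint alpha_bits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alpha_bits);
    printf("destination alpha bits: %d\n", alpha_bits);
}
```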

-mr. bill

Destination alpha can be a problem. It depends on the framebuffer pixel format and on how the driver writes alpha values to the framebuffer. I’d rather not use destination alpha in glBlendFunc unless it’s my only choice.

YES, THANKS!
The problem occurs when I use GL_DST_ALPHA and GL_ONE_MINUS_DST_ALPHA. Now I see that the resulting picture with those parameters depends on the framebuffer format and on GL_ALPHA_BITS.
I don’t have time to study this right now, so I’ll just use the blends without GL_DST_ALPHA and GL_ONE_MINUS_DST_ALPHA.
That works.
I hope that will be okay; I’ll test it fully later and post the result.

Better solution found! :slight_smile:
I called glGetIntegerv( GL_ALPHA_BITS ) and the result was 0 bits. That means no alpha is written to my color buffer on the NVIDIA, but alpha is written on other adapters with an RGBA color buffer format.
So using GL_DST_ALPHA was silly, because with an RGB color format the destination alpha is always 1.
I’ve replaced GL_DST_ALPHA with GL_ONE and GL_ONE_MINUS_DST_ALPHA with GL_ZERO.
Now the resulting picture is the same as before, and it looks the same on other adapters with either an RGB or an RGBA color buffer format.
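
A minimal sketch of that substitution, assuming the blend factors go through a single helper before reaching glBlendFunc (the helper itself is hypothetical):

```c
#include <GL/gl.h>

/* An RGB color buffer behaves as if destination alpha were always 1.0,
   so these replacements give the same result on RGB and RGBA buffers. */
static GLenum portable_factor(GLenum factor)
{
    if (factor == GL_DST_ALPHA)           return GL_ONE;
    if (factor == GL_ONE_MINUS_DST_ALPHA) return GL_ZERO;
    return factor;
}

void set_blend_func_portable(GLenum src, GLenum dst)
{
    glBlendFunc(portable_factor(src), portable_factor(dst));
}
```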

THANKS AGAIN.

… and yes, it works on ATI, too. I’ve just tested it.
I hope this helps someone else.
:wink: