glBlendFunc(GL_DST_ALPHA, GL_ZERO) problem on nVIDIA

I’m writing an OpenGL application that renders a grayscale image of the alpha values in the frame buffer (used for alpha keying in broadcast video):

// draw scene

// Grayscale rendering of the alpha values
glEnable(GL_BLEND);
glBlendFunc(GL_DST_ALPHA, GL_ZERO);
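// (result = white * dst_alpha + dst * 0, i.e. the destination alpha shown as gray)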

draw a white quad the size of the viewport…

Works perfectly on ATI hardware.
When using a PNY Quadro FX 3000G (driver 52.76) I get a plain white quad instead of the grayscale alpha key of what’s in the frame buffer. Same problem on a GeForce4 MX 440 with various driver revisions.
When I use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), blending works as expected on both ATI and nVIDIA.

Is it a driver problem or am I missing something?
I’ve mailed sdkfeedback@nvidia.com; so far no response from nVIDIA.


Are you sure you got an alpha channel?

You have to use a 24-bit Z buffer, 8-bit stencil and 32-bit color, and request 8 bits for each of the red, green, blue and alpha planes.
If you don't do this, it's possible you don't get an alpha channel -> DST_ALPHA then works as if alpha were always 1.

You can easily check whether you got an alpha channel with glGetIntegerv(GL_ALPHA_BITS, ...).
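For example, something like this once the context is current (a minimal sketch; the function name and message are mine):

#include <stdio.h>
#include <GL/gl.h>

/* Sanity check: does the framebuffer really have destination alpha? */
static void check_dst_alpha(void)
{
    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    if (alphaBits == 0) {
        /* no destination alpha -> GL_DST_ALPHA behaves as if alpha == 1 */
        fprintf(stderr, "pixel format has no alpha planes!\n");
    }
}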

Jan.

Originally posted by Jan2000:
[b]Are you sure you got an alpha channel? […][/b]

You’re right! I used SDL for setting up the window and didn’t have an alpha channel. I changed the code to native Windows, specified a pixel format with an alpha buffer, and now it works correctly.
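For reference, the wgl side looks roughly like this (a sketch using the buffer sizes Jan suggested; the function name and the surrounding window code are placeholders):

#include <windows.h>

/* hdc is the window's device context */
static void setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize        = sizeof(pfd);
    pfd.nVersion     = 1;
    pfd.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType   = PFD_TYPE_RGBA;
    pfd.cColorBits   = 32;
    pfd.cAlphaBits   = 8;    /* the important part: destination alpha planes */
    pfd.cDepthBits   = 24;
    pfd.cStencilBits = 8;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    SetPixelFormat(hdc, fmt, &pfd);
}

(With SDL the equivalent would presumably be SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8) before SDL_SetVideoMode.)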

Still wondering why it worked on ATI hw before…
Thx!

Originally posted by hcatry:
Still wondering why it worked on ATI hw before…

You can often get alpha for free, thanks to memory alignment (note the unused pad byte in the first layout):
R G B . Z0 Z1 Z2 S
R G B A Z0 Z1 Z2 S

So it may be enabled if you don’t explicitly disable it. Or not.

Originally posted by hcatry:
[b][…]
Still wondering why it worked on ATI hw before…
Thx!

[/b]

That’s because when you request a pixel format, you only get a format that “best matches your requirements”; it doesn’t guarantee what you are actually getting. I.e. the driver may decide that a similar (but not equal) format will do the job better than the one you requested, and so it gives you the best accelerated mode instead of the best request-matching mode.
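So it pays to check what you actually got, e.g. with DescribePixelFormat (a sketch; the helper name is made up), or with glGetIntegerv(GL_ALPHA_BITS, …) once the context is current:

#include <windows.h>
#include <stdio.h>

/* hdc/fmt are the device context and the index ChoosePixelFormat returned */
static void dump_pixel_format(HDC hdc, int fmt)
{
    PIXELFORMATDESCRIPTOR got;
    DescribePixelFormat(hdc, fmt, sizeof(got), &got);
    printf("color %d  alpha %d  depth %d  stencil %d\n",
           got.cColorBits, got.cAlphaBits, got.cDepthBits, got.cStencilBits);
}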