View Full Version : front-to-back compositing



Budde
10-28-2003, 04:30 AM
I need front-to-back compositing functionality, and my first intuition was simply to use

(GL_ONE_MINUS_DST_ALPHA, GL_DST_ALPHA) as the blending parameters.
A blank screen led me to the idea that the color buffer might be RGB only. But



GLboolean bMode;
glGetBooleanv(GL_RGBA_MODE, &bMode);

results in GL_TRUE, so I decided to take a closer look into the buffer. I then noticed that the alpha channel was 1.0 everywhere, so rendering an object with the above blending function must of course result in (0,0,0,0). So clearing the buffer with (0,0,0,0) before drawing should make a rendered object visible with exactly its color and alpha values, shouldn't it?
Finally, I observed that clearing the buffer actually doesn't clear it ;-)




// get the current (back) buffer
GLint bufferIndex;
glGetIntegerv(GL_DRAW_BUFFER, &bufferIndex);

// and clear it
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

// just to be sure...
glFinish();

// have a closer look at some region of the buffer
float pMem[10 * 10 * 4];
glReadBuffer((GLenum)bufferIndex);
glReadPixels(100, 100, 10, 10, GL_RGBA, GL_FLOAT, pMem);

Now I'm completely confused....

Who can help me?

Deiussum
10-28-2003, 04:44 AM
Usually you won't get a destination alpha channel by default unless you explicitly request one. How you request it depends on how you are setting up your window. With the Win32 API, you have to set the number of alpha bits you want in the PIXELFORMATDESCRIPTOR. With GLUT, I think you pass GLUT_ALPHA as one of the flags in your call to glutInitDisplayMode.
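A minimal sketch of what that request could look like in a GLUT program (assuming a GLUT-based setup; the window title is arbitrary, and this can't be verified without a display):

```c
#include <GL/glut.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_ALPHA asks for destination alpha bits in the color buffer */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH);
    glutCreateWindow("alpha test");
    /* ... register callbacks here and enter glutMainLoop() ... */
    return 0;
}
```

Note that the driver is free to hand back a pixel format without alpha bits even when asked, so it's worth verifying afterwards.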

Relic
10-28-2003, 04:48 AM
It's GL_RGBA_MODE vs. color index mode.
Having GL_RGBA_MODE set doesn't mean you have a destination alpha channel, only that OpenGL works on RGBA values instead of color indices.
You need to query GL_ALPHA_BITS, or use DescribePixelFormat and look at cAlphaBits!

Yes, if you have no destination alpha buffer, a read of alpha will return all 1s.

Make sure your glReadBuffer is set correctly, too. For double buffering, both draw and read default to GL_BACK initially.

Oh, and if even the clear doesn't work, make sure you have a rendering context current while you're using OpenGL.
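For example, the query could look like this (a fragment, assuming the GL headers are included and a rendering context is current; it can't be run without one):

```c
/* Requires a current rendering context */
GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
if (alphaBits == 0) {
    /* no destination alpha plane: alpha always reads back as 1.0,
       and destination-alpha blending cannot work */
}
```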

[This message has been edited by Relic (edited 10-28-2003).]

Budde
10-28-2003, 07:04 AM
Thanks for the quick answer!
OK, I thought that GLUT_RGBA in glutInitDisplayMode(...) directly gives a buffer with alpha... The problem is that I can't call this function myself, and surely I can't change the pixel format since it's already set!?
What about creating another context with the desired settings, rendering the whole scene into it, and then copying all the pixels into the (first) framebuffer? Is this the only/best solution for me, even though it's expensive?

Relic
10-28-2003, 08:10 AM
"Problem is, that I can't call this method on my own and surely I can't change the PixelFormat since it's already set!?"

You're working on a project where you cannot influence the OpenGL pixelformat? Ouch.

Yes, pixel formats can only be set once per window.
You then need to create a whole new window with a different pixel format (not just a new OpenGL context).

BTW, destination alpha will never work in 16 bit color resolutions.

To copy the data from one window to another, you would need a glReadPixels/glDrawPixels round trip through the host. This sucks big time and will not work if the image is occluded by another window, because the readback is undefined for unexposed regions. You can use a pbuffer to solve that.

[This message has been edited by Relic (edited 10-28-2003).]

Budde
10-28-2003, 09:36 AM
destination alpha will never work in 16 bit color resolutions
clear! ;-)


This sucks big time and will not work if the image is occluded by another window, because the readback is undefined for unexposed regions. You can use a pbuffer to solve that.
That's exactly what I thought of!!! I already use a pbuffer for light attenuation as part of the shadow computation. Using another one, and especially the pixel read/write, would additionally slow down an application that isn't very fast anyway *g*; so I actually wanted to avoid that approach....

Thanks anyway!

Budde
10-31-2003, 01:24 AM
Me again...
I solved the problem with the alpha bits, but I still don't understand the behavior of the 'clearing' code above (in my first post). I have a valid rendering context and glGetError() reports no error, so in principle it should work, shouldn't it???