View Full Version : fading problem

08-13-2009, 10:05 PM
Hi folks,

I'm doing something pretty basic, but being the beginner I am, I've run into a brick wall, head first.

I've got a 2D (orthographic) app. I have a "window" assembled by a series of quads and line loops, and I want to fade it out. Is there a way to do this without modifying the glColor alpha parameter for every single vertex?

Could I do something like this?
1. Enable lighting
2. Set the lighting alpha value to match the current alpha at this point in the fade
3. Render the "window"
4. Disable lighting

It seems like quite a hack, and when I've tried it, the quads and line loops turn white. The code I'm using for the fade effect is below:

void Fade::Begin()
{
    GLfloat light0[] = {0.0f, 1.0f, 1.0f, 1.0f};
    glLightfv(GL_LIGHT0, GL_AMBIENT, light0);

    GLfloat global_ambient[] = {0.5f, 0.5f, 0.5f, m_alpha};
    glLightModelfv(GL_LIGHT_MODEL_AMBIENT, global_ambient);
}

void Fade::End()
{
}

I feel like I'm going entirely in the wrong direction. Any suggestions?

08-14-2009, 09:41 AM
Hmm. No ideas?

08-15-2009, 06:40 AM
Indeed this seems quite fragile.
A simple and robust (but not very fast) method is to render a black fullscreen quad on top of the finished frame, varying its alpha over time.

Something like:

- draw the scene (render the "window")
- enable blending and set the blend function (the default factors ignore alpha):
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
- the quad's alpha goes from 0.0f (scene fully visible) to 1.0f (fully black) for a fade to black
- draw a black full-screen quad with that alpha
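A minimal sketch of the last two steps. The helper name `fade_alpha` and the two-second duration are illustrative, not from the post; the GL calls are shown as comments since they need a live context to run:

```c
/* Illustrative helper: maps elapsed time to the overlay quad's alpha.
   0.0 = scene untouched, 1.0 = fully black. */
float fade_alpha(float elapsed, float duration)
{
    if (elapsed <= 0.0f)     return 0.0f;
    if (elapsed >= duration) return 1.0f;
    return elapsed / duration;  /* linear ramp in between */
}

/* Each frame, after drawing the scene (sketch, assuming an ortho
 * projection where (0,0)-(w,h) covers the viewport):
 *
 *   glEnable(GL_BLEND);
 *   glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
 *   glColor4f(0.0f, 0.0f, 0.0f, fade_alpha(t, 2.0f));
 *   glBegin(GL_QUADS);            // black full-screen quad
 *     glVertex2f(0, 0); glVertex2f(w, 0);
 *     glVertex2f(w, h); glVertex2f(0, h);
 *   glEnd();
 *   glDisable(GL_BLEND);
 */
```

This leaves the "window" geometry untouched, so no per-vertex glColor changes are needed.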