I have an application that renders to the screen. I then switch it to rendering to an FBO. I’m using the GL_COMBINE texture environment mode as follows:
The results render to the screen satisfactorily; however, the results in the FBO have a largely transparent alpha channel. The FBO is cleared to opaque before rendering begins.
If I force the alpha to opaque after rendering, the FBO then matches the screen rendering. I have tried all manner of GL_COMBINE_ALPHA options, to no avail.
Apart from writing a shader, do you have any suggestions?
The primary color and alpha are from a small texture. The intent is to use a constant color with an alpha that is a modulated version of the constant alpha.
The constant color and alpha are set with a call to:
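That is, glTexEnvfv with GL_TEXTURE_ENV_COLOR, which is the color that GL_CONSTANT sources; the actual values below are just placeholders (a sketch, not the real call site):

```c
// GL_CONSTANT reads the texture-environment color; values here are placeholders
GLfloat env_color[4] = { 1.0f, 0.0f, 0.0f, 0.5f };
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, env_color);
```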
The modulation of the alpha I intend is just as you describe: I want the texture alpha to be multiplied by the constant alpha. If the alpha of a texel is zero (transparent), then I want that texel to remain transparent. If the texel is opaque and the constant is 0.5, then I want the result to be semi-transparent.
I clearly don’t understand how to specify the operands. Somehow I thought that specifying GL_CONSTANT implicitly defined one operand and GL_TEXTURE implicitly defined the other. I will re-read the description in the Red Book and restart my experiments.
Here’s a commented (and slightly corrected) version of your code.
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE); // replace RGB operation (assignment)
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_CONSTANT); // source for RGB data
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR); // destination for RGB data (GL_SRC_COLOR doesn’t quite make sense?)
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_MODULATE); // modulate an alpha with an alpha (multiply)
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_ALPHA, GL_TEXTURE); // first alpha operand is from the texture
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_ALPHA, GL_CONSTANT); // second operand is the constant alpha
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA); // destination for the alpha data.
Can you please confirm that I’m on the right track?
OK, I know that’s not what you asked for, but think about it for a bit. You’ve got FBO support, so you definitely have shader support as well, and you’ll be able to express the operations you want far more clearly and concisely, with less risk of bugs or of stray state changes messing things up.
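For instance, the whole GL_COMBINE setup above boils down to a fragment shader of a couple of lines; roughly this sketch (the uniform names are mine):

```glsl
uniform sampler2D tex;        // the texture supplying the alpha
uniform vec4 const_color;     // plays the role of the GL_CONSTANT color

void main()
{
    // RGB: GL_REPLACE with the constant; alpha: GL_MODULATE of texture and constant
    float tex_alpha = texture2D(tex, gl_TexCoord[0].st).a;
    gl_FragColor = vec4(const_color.rgb, tex_alpha * const_color.a);
}
```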
So here’s my latest iteration, with the operands specified correctly (I think):
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE); // replace RGB operation (assignment)
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB, GL_CONSTANT); // source for RGB data
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR); // operand for source
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_MODULATE); // modulate an alpha with an alpha (multiply)
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_ALPHA, GL_TEXTURE); // first alpha operand is from the texture
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_ALPHA, GL_CONSTANT); // second operand is the constant alpha
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA); // operand for source
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_ALPHA, GL_SRC_ALPHA); // operand for source
But the FBO is still mostly transparent while the screen-rendered version is OK. It seems my first version (without explicit operands) probably worked because the default behavior is to select the unmodified source values, at least with the OpenGL driver I have.
So I still have my original problem:
FBO rendering != screen rendering
I’m guessing I have the texture environment coded satisfactorily, so there must be something wrong with the FBO setup instead.
Mine looks like this:
// create a framebuffer object
glGenFramebuffersEXT(1, &frame_buffer_id);
// GL_FRAMEBUFFER target simply sets both the read and the write to the same FBO.
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, frame_buffer_id);
if(!glIsFramebufferEXT(frame_buffer_id))
{
    program_error(__FILE__, __LINE__, __FUNC__,
                  "glIsFramebufferEXT failed");
}
// create a renderbuffer object to store the image
glGenRenderbuffersEXT(1, &render_buffer0_id);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, render_buffer0_id);
if(!glIsRenderbufferEXT(render_buffer0_id))
{
    program_error(__FILE__, __LINE__, __FUNC__,
                  "glIsRenderbufferEXT failed");
}
// We are guaranteed to be able to have at least color attachment 0
// attach the renderbuffer to GL_COLOR_ATTACHMENT0_EXT
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                             GL_RENDERBUFFER_EXT, render_buffer0_id);
OpenGL_error_check(__FILE__, __LINE__, __FUNC__);
// allocate storage for the renderbuffer
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA, width, height);
OpenGL_error_check(__FILE__, __LINE__, __FUNC__);
// now check FBO completeness status
GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if(status != GL_FRAMEBUFFER_COMPLETE_EXT)
{
    program_error(__FILE__, __LINE__, __FUNC__,
                  "framebuffer incomplete");
}
There is no depth buffer but that shouldn’t affect the alpha rendering…
Is there anything wrong here? In what ways can the screen buffer differ from an FBO when rendering alpha?