
View Full Version : can't figure out alpha blitting



Jose Goruka
11-27-2008, 06:51 PM
So, imagine I created an FBO with an RGBA format (it has an alpha channel).
I want to write values to its alpha component from a fragment program. However, whatever I write to it using gl_FragData[] ends up as "1" (opaque), so I'm thinking it must be the blend mode. I want my arbitrary value written to the alpha channel from the fragment program. So, what blend mode should I use so that instead of always writing 1.0 to alpha, it just overwrites alpha with whatever I write in gl_FragData?

thanks

Ilian Dinev
11-27-2008, 10:03 PM
I never had that problem; alpha was there, with the computed value. No blending involved.
nVidia cards.

MrMacete
11-28-2008, 01:03 AM
How did you create the context? Does it have the alpha channel enabled?

Jose Goruka
11-28-2008, 03:02 AM
It seems I didn't make the question clear. The question is:

How do you set up blending so that what I want to achieve happens?
I tried disabling GL_BLEND, and enabling GL_BLEND with glBlendFunc(GL_ONE, GL_ZERO), but nothing changes: alpha is always 1 in the destination buffer no matter what value I write to it from the fragment program.
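
For reference, a minimal sketch of the two setups just described (nothing else assumed about the surrounding code); with either one, whatever the fragment shader outputs, including alpha, should land in the buffer unchanged:


/* variant 1: no blending at all -- fragment output is written as-is */
glDisable(GL_BLEND);

/* variant 2: blending enabled, but src*1 + dst*0 reduces to a plain overwrite */
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ZERO);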

ZbuffeR
11-28-2008, 03:18 AM
As already said, blending has nothing to do with this; just disable it.
Have a look at the FBO creation: apparently it was not created with an alpha channel. Even without a shader, you can try clearing it with glClearColor(0, 0, 0, 0.5) followed by glClear, and check that you see this 0.5 alpha value.
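
A minimal sketch of that test (assuming the FBO is already bound as the current framebuffer, and reading one pixel back with glReadPixels):


/* clear the bound FBO with a distinctive alpha value */
glDisable(GL_BLEND);
glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
glClear(GL_COLOR_BUFFER_BIT);

/* read one pixel back and inspect the alpha component */
GLfloat pixel[4];
glReadPixels(0, 0, 1, 1, GL_RGBA, GL_FLOAT, pixel);
printf("alpha after clear = %f\n", pixel[3]); /* expect 0.5 if the attachment really has alpha */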

MrMacete
11-28-2008, 03:22 AM
I tried disabling GL_BLEND, and enabling GL_BLEND with glBlendFunc(GL_ONE, GL_ZERO), but nothing changes


This should make you think that blending is not the cause of your problem, as Ilian already said. :)

Can you post a snippet of code? I think the problem is in how you create the context (e.g. if you use GLUT, I want to see your glutInitDisplayMode parameters...)

Jose Goruka
11-28-2008, 03:31 AM
I'm using X11/GLX directly, but does the way the context is created matter? As in, this all happens when rendering to an FBO, not the regular framebuffer.

Jose Goruka
11-28-2008, 03:37 AM
FBO creation code:



glGenFramebuffersEXT(1, &data_fbo.fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, data_fbo.fbo);

glGenTextures(1, &data_fbo.data);

glBindTexture(GL_TEXTURE_2D, data_fbo.data);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, data_fbo.width, data_fbo.height, 0, GL_RGBA, GL_FLOAT, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, data_fbo.data, 0);
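
A completeness check right after this creation code (a small sketch, using the same EXT entry points) would confirm that the driver accepted the attachment:

GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
if (status != GL_FRAMEBUFFER_COMPLETE_EXT)
    printf("FBO incomplete, status = %04x\n", status);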


Then the fragment program that writes to the FBO does:


gl_FragData[0] = vec4( 0.5,0.8,0.5, 0.1 );


Then finally, to test what was written to the FBO, I bind back the normal framebuffer, bind the FBO texture (data_fbo.data) as a regular texture, and run a fragment program with the following code (drawing a fullscreen quad):


vec4 data = texture2D( data_fbo_tex, gl_TexCoord[0].st );
gl_FragColor.rgb = data.rgb;


which works flawlessly (I see the color 0.5, 0.8, 0.5).
But if instead I write this line:



gl_FragColor.rgb = vec3(1.0, 1.0, 1.0) * data.a;

everything is white, instead of the color 0.1, 0.1, 0.1, which is what I expect, given that is what I saved into the alpha component.
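
One way to take the display path out of the equation (a sketch; data_fbo.data, data_fbo.width and data_fbo.height are the names from the creation code above) is to read the texture back and print a texel directly:


/* read the FBO texture back to client memory and check the first texel */
glBindTexture(GL_TEXTURE_2D, data_fbo.data);
GLfloat *pixels = (GLfloat *) malloc(data_fbo.width * data_fbo.height * 4 * sizeof(GLfloat));
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, pixels);
printf("first texel = %f %f %f %f\n", pixels[0], pixels[1], pixels[2], pixels[3]);
free(pixels);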

MrMacete
11-28-2008, 03:43 AM
I'm using X11/GLX directly, but does the way the context is created matter? As in, this all happens when rendering to an FBO, not the regular framebuffer.

Yes, you don't use the regular framebuffer, but the GL context is the same!

If you do not enable the alpha channel in the context, you cannot write to it.

Note that normally you don't need to have it enabled, because your window is not transparent and what you see is the result of blending. When you use an FBO, though, you need to enable it explicitly, because no blending occurs; you really need alpha on the context.

Jose Goruka
11-28-2008, 03:48 AM
I create the context with the following values:

GLX_RGBA,
GLX_DOUBLEBUFFER,
GLX_RED_SIZE, 4,
GLX_GREEN_SIZE, 4,
GLX_BLUE_SIZE, 4,
GLX_DEPTH_SIZE, 24,
None

Should I enable alpha somewhere else?

MrMacete
11-28-2008, 04:31 AM
OK, maybe I'm wrong; I'm not experienced with GLX, but I observed a similar behaviour using GLUT or Qt. For example, in GLUT (on Windows) I need to enable the alpha channel even in the window draw buffer to make shader output to an FBO's alpha work too.

Jose Goruka
11-28-2008, 04:32 AM
you can try clearing it with glClearColor(0, 0, 0, 0.5) followed by glClear, and check that you see this 0.5 alpha value.

Hmm, it doesn't seem to be there either, so it seems the FBO is actually lacking alpha?

Jose Goruka
11-28-2008, 04:46 AM
OK, maybe I'm wrong; I'm not experienced with GLX, but I observed a similar behaviour using GLUT or Qt. For example, in GLUT (on Windows) I need to enable the alpha channel even in the window draw buffer to make shader output to an FBO's alpha work too.

Well, how do you do this in GLUT or Qt, so I can get an idea (by looking at their source code) of how to do it in GLX?

MrMacete
11-28-2008, 04:54 AM
in GLUT:

glutInitDisplayMode( GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGBA | GLUT_ALPHA );

in Qt:

QGLFormat format = QGLFormat( QGL::DoubleBuffer | QGL::Rgba | QGL::DirectRendering | QGL::AlphaChannel | QGL::NoSampleBuffers );

and pass it to the constructor of QGLWidget

ZbuffeR
11-28-2008, 05:19 AM
4 bits seem strange, try 8, and add alpha:

GLX_RGBA,
GLX_DOUBLEBUFFER,
GLX_RED_SIZE, 8,
GLX_GREEN_SIZE, 8,
GLX_BLUE_SIZE, 8,
GLX_ALPHA_SIZE, 8,
GLX_DEPTH_SIZE, 24,
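
For completeness, a sketch of how that list would typically be passed to glXChooseVisual (the Display pointer is assumed to be whatever the existing setup code already opened); note that GLX attribute lists are None-terminated, and glXGetConfig can confirm what the chosen visual actually provides:

static int attribs[] = {
    GLX_RGBA,
    GLX_DOUBLEBUFFER,
    GLX_RED_SIZE, 8,
    GLX_GREEN_SIZE, 8,
    GLX_BLUE_SIZE, 8,
    GLX_ALPHA_SIZE, 8,
    GLX_DEPTH_SIZE, 24,
    None
};

XVisualInfo *vi = glXChooseVisual(display, DefaultScreen(display), attribs);
int alpha_bits = 0;
glXGetConfig(display, vi, GLX_ALPHA_SIZE, &alpha_bits);
/* alpha_bits should come back as 8 (or more) if the visual really has alpha */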

_NK47
11-28-2008, 06:30 AM
A value of 4 is totally weird. That makes 12 bits per pixel? I'm surprised it even got running.

Jackis
11-28-2008, 11:49 AM
Don't forget to unmask alpha writes. I mean, be sure that you have called glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE).
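
A quick way to verify the current mask (a small sketch):

GLboolean mask[4];
glGetBooleanv(GL_COLOR_WRITEMASK, mask);
printf("color mask: r=%d g=%d b=%d a=%d\n", mask[0], mask[1], mask[2], mask[3]);
/* if the last value is 0, alpha writes are masked off and the FBO's alpha never changes */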

Jose Goruka
11-28-2008, 12:20 PM
Voila! That fixed it! It seems I needed to set alpha in the main framebuffer, otherwise I don't get alpha in the FBOs... weird.

thanks a lot!

arekkusu
11-28-2008, 01:01 PM
That's definitely not true. The window's pixel format does not influence the format of FBO attachments. Double check GL_ALPHA_BITS with and without an FBO bound, and also the internal format of your FBO attachments.

Jose Goruka
11-30-2008, 06:26 AM
That's definitely not true. The window's pixel format does not influence the format of FBO attachments. Double check GL_ALPHA_BITS with and without an FBO bound, and also the internal format of your FBO attachments.


I understand that the format isn't the same, but it seems that if I don't define alpha bits at all, my FBOs don't get an alpha channel created on them. Maybe this is a driver problem from nVidia?

arekkusu
11-30-2008, 11:35 AM
Look at the actual data you have.

Bind the FBO. Then,

GLint bits;
glGetIntegerv(GL_ALPHA_BITS, &bits);
printf("The FBO drawable actually has %d alpha bits\n", bits);

If you bind back to FBO zero and repeat this query, you should get zero if the window drawable really has zero alpha bits.

Also double check the internal format of the attachment you're rendering into. It's possible for the driver to pick a different precision than the one you requested, but if you asked for alpha when you created the attachment, the driver has to give you alpha.

If you're rendering into a texture:
GLint internal;
glBindTexture(<texture target>, <your attachment id>);
glGetTexLevelParameteriv(<texture target>, <level you attached to the fbo>, GL_TEXTURE_INTERNAL_FORMAT, &internal);
printf("Actual texture internal format is %04x\n", internal);

Or, if you're rendering into a renderbuffer:
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, <your attachment id>);
glGetRenderbufferParameterivEXT(GL_RENDERBUFFER_EXT, GL_RENDERBUFFER_INTERNAL_FORMAT_EXT, &internal);
printf("Actual renderbuffer internal format is %04x\n", internal);

If you created the attachment with a generic internal format like "RGBA", this should come back as something more specific, like RGBA8.
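
For decoding that hex value, a few common internal formats it might print (the 16F one matches the GL_RGBA16F_ARB requested in the creation code earlier in the thread):

/* 0x8058 = GL_RGBA8       */
/* 0x805B = GL_RGBA16      */
/* 0x881A = GL_RGBA16F_ARB */
/* 0x8814 = GL_RGBA32F_ARB */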