Render to Texture and keeping alpha



uzu_manga
08-20-2005, 08:31 AM
The problem (probably easy to solve) is as follows. When I do a render to texture using glBindTexture and glCopyTexImage2D:



glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)         # clear color + depth
glBindTexture(GL_TEXTURE_2D, self.render_texture.texID)    # bind the target texture
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, viewport[3] - size[1], size[0], size[1], 0)  # copy framebuffer -> texture
glViewport(0, 0, viewport[2], viewport[3])                 # restore the full viewport
The alpha is not transferred to the texture. I thought an empty area (i.e. cleared but never drawn to) should come out transparent in the texture. The code works with the latest Xorg ATI precompiled drivers, but it does not work with any nVidia drivers, so something is very wrong.

glTexImage2D(GL_TEXTURE_2D, 0, 4, size[0], size[1], 0, GL_RGBA, GL_UNSIGNED_BYTE, data)

This is the command I use to create the original empty texture.
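A quick way to check whether the framebuffer actually stores alpha is to read one pixel back right after the clear and inspect its fourth component (a sketch; PyOpenGL's glReadPixels returns the values directly):

from OpenGL.GL import *   # PyOpenGL

# If the framebuffer has no alpha planes, the alpha component reads back
# as 1.0 no matter what glClearColor requested.
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
pixel = glReadPixels(0, 0, 1, 1, GL_RGBA, GL_FLOAT)
print("cleared pixel RGBA:", pixel)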

tamlin
08-20-2005, 03:04 PM
I'm sorry to ask the most obvious question of all, but does your framebuffer even have an alpha channel, and is it set using glClearColor? If not, IIRC the default is opaque.

(sidenote: I think close to 100% of developers will agree that GL_RGBA should be preferred over the ancient integral value 4 used in the displayed glTexImage2D call)
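That is, the same call spelled with the symbolic constant (same arguments otherwise):

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, size[0], size[1], 0, GL_RGBA, GL_UNSIGNED_BYTE, data)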

Brorsson
08-21-2005, 04:03 AM
Well, I also thought that could be the case, but I'm using glClearColor(0.5, 0.5, 0.5, 0.0), so the alpha value is set to 0, not 1 (which is opaque). An alpha value of 1.0 gives me the same result on both ATI and nVidia: a gray background.

jra101
08-22-2005, 04:55 PM
If you query GL_ALPHA_BITS with glGetIntegerv, do you get a non-zero value?
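In PyOpenGL that query is a one-liner (sketch):

alpha_bits = glGetIntegerv(GL_ALPHA_BITS)   # 0 means the framebuffer has no alpha planes
print("framebuffer alpha bits:", alpha_bits)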

uzu_manga
08-25-2005, 08:34 AM
glGetIntegerv(GL_ALPHA_BITS) returns 0 on the nVidia drivers; I haven't tried it on any ATI drivers yet. If I'm not mistaken, that 0 should be non-zero...

Brorsson
08-26-2005, 02:59 PM
SOLVED!
I finally realized that I needed to request an 8-bit alpha channel; after doing that, everything worked just fine. Stupid me...
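For the record, the alpha planes have to be requested when the window is created, and how you do that depends on the windowing toolkit. Two common ways from Python (a sketch; pygame and GLUT here are both assumptions, since the thread never says which toolkit is in use):

# pygame variant (assumption): ask SDL for 8 alpha bits before creating the window.
import pygame
pygame.init()
pygame.display.gl_set_attribute(pygame.GL_ALPHA_SIZE, 8)
pygame.display.set_mode((512, 512), pygame.OPENGL | pygame.DOUBLEBUF)

# GLUT variant (assumption): include GLUT_ALPHA in the display mode.
from OpenGL.GLUT import *
glutInit()
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH)
glutCreateWindow("rtt")

With either setup, glGetIntegerv(GL_ALPHA_BITS) should report 8, and the cleared (alpha = 0) areas come through transparent in the copied texture.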