Render to Texture and keeping alpha

The problem (probably easy to solve) is as follows. When I do a render to texture using glBindTexture and glCopyTexImage2D:

 
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
glBindTexture(GL_TEXTURE_2D, self.render_texture.texID)
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, viewport[3] - size[1], size[0], size[1], 0)
glViewport(0, 0, viewport[2], viewport[3])
 

the alpha is not transferred to the texture. I thought that an empty area (read: cleared but not written to) should become transparent in the texture. This approach works with the latest precompiled ATI Xorg drivers, but it does not work with any NVIDIA drivers, so something is very wrong.

This is the command I use to produce the original empty texture:

glTexImage2D(GL_TEXTURE_2D, 0, 4, size[0], size[1], 0, GL_RGBA, GL_UNSIGNED_BYTE, data)
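For reference, here is a self-contained PyOpenGL sketch of the same flow, under some assumptions: the texture size, function names, and the point where the scene is drawn are placeholders, not the poster's actual code.

from OpenGL.GL import *

TEX_W, TEX_H = 256, 256  # placeholder for size[0], size[1] in the post

def create_empty_texture():
    # Allocate an empty RGBA texture that glCopyTexImage2D will overwrite later.
    tex_id = glGenTextures(1)
    glBindTexture(GL_TEXTURE_2D, tex_id)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, TEX_W, TEX_H, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, None)
    return tex_id

def copy_scene_to_texture(tex_id, viewport):
    # Clear with alpha = 0 so pixels that are never drawn stay transparent,
    # draw the scene, then copy the top-left TEX_W x TEX_H block of the
    # framebuffer into the texture (y is measured from the bottom in GL).
    glClearColor(0.5, 0.5, 0.5, 0.0)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    # ... render the scene here ...
    glBindTexture(GL_TEXTURE_2D, tex_id)
    glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                     0, viewport[3] - TEX_H, TEX_W, TEX_H, 0)
    glViewport(0, 0, viewport[2], viewport[3])

Note that this only preserves alpha if the framebuffer itself actually has destination alpha bits, which is what the rest of the thread turns out to hinge on.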

I’m sorry to ask the most obvious question of all, but does your framebuffer even have an alpha channel, and are you setting the clear alpha with glClearColor? If not, IIRC the default is opaque.

(Side note: I think close to 100% of developers will agree that GL_RGBA should be preferred over the ancient integral value 4 used in the posted glTexImage2D call.)
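For example, the same allocation with the symbolic constant, keeping all other parameters exactly as in the posted call:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, size[0], size[1], 0, GL_RGBA, GL_UNSIGNED_BYTE, data)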

Well, I also thought that could be the case, but I’m using glClearColor(0.5, 0.5, 0.5, 0.0), so the alpha value is set to 0, not 1 (which is opaque). A clear alpha of 1.0 gives me the same result on both ATI and NVIDIA: a gray background.

If you query GL_ALPHA_BITS with glGetIntegerv, do you get a non-zero value?
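Something along these lines (a PyOpenGL sketch; depending on the PyOpenGL version the result may come back as a one-element array rather than a plain int):

alpha_bits = glGetIntegerv(GL_ALPHA_BITS)
print('framebuffer alpha bits:', alpha_bits)  # 0 means there is no destination alpha for glCopyTexImage2D to read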

glGetIntegerv(GL_ALPHA_BITS) returns 0 on the NVIDIA drivers; I haven’t tried it on any ATI drivers yet. If I’m not mistaken, that 0 should be at least a 1…

SOLVED!
I finally realized that I needed to request an 8-bit alpha channel; after doing that, everything worked just fine. Stupid me…
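For anyone finding this later: how the alpha bits are requested depends on the windowing toolkit, which the thread doesn't name. With pygame it would look roughly like this (the window size is a placeholder); with GLUT the equivalent is adding GLUT_ALPHA to glutInitDisplayMode.

import pygame
from pygame.locals import OPENGL, DOUBLEBUF, GL_ALPHA_SIZE

pygame.init()
# Ask for 8 alpha bits in the default framebuffer *before* the GL context is created.
pygame.display.gl_set_attribute(GL_ALPHA_SIZE, 8)
screen = pygame.display.set_mode((640, 480), OPENGL | DOUBLEBUF)

After this, glGetIntegerv(GL_ALPHA_BITS) should report a non-zero value, and the cleared-but-undrawn areas copied by glCopyTexImage2D come out transparent.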