glCopyTexSubImage2D and alpha values



Zeno
04-05-2001, 03:56 PM
Hi all. This seems like a dumb question, but I can't figure out what I'm doing wrong.

Does glCopyTexSubImage2D() copy alpha values as well? It doesn't seem like it based on what I'm getting, but I think it should.

I've narrowed it down to a test like this:

// Fill texturemap with completely transparent color (black)
glClearColor(0.0, 0.0, 0.0, 0.0);
glClear(GL_COLOR_BUFFER_BIT);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, x, y); // x, y = texture width/height
// Clear screen to white
glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
// Rest of code draws textured poly over the entire screen

I have blending enabled with (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) and am using texture blend mode GL_MODULATE.

Now, you would expect this to give a white screen since the textured poly has alpha=0. Unfortunately, the screen shows up black!?

Am I doing something wrong here? Could this be a driver bug? (I'm using 11.0) Any ideas would be greatly appreciated.

Thanks,
-- Zeno

Relic
04-05-2001, 11:15 PM
This needs more information to be answered:
- Which color depth are you running in?
- Does your pixel format contain alpha bits?
- How did you call glTexImage2D beforehand to set the internal format?

DaViper
04-06-2001, 03:50 AM
A little more code from the drawing and blending parts would help, too.

j
04-06-2001, 06:28 AM
Are you running in 32-bit color? For glCopyTexSubImage2D to work here, you need an alpha channel in the color buffer. 16-bit color won't work.

j

Tom Nuydens
04-06-2001, 06:55 AM
Relic summed it up perfectly:
- You MUST run in 32 bpp (not 24, not 16, and definitely not less than 16).
- Your pixel format should be 32 bit RGBA as well.
- You need to use GL_RGBA as the internal format when you first call glCopyTexImage2D().
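A minimal allocation sketch along those lines (texture size and variable names are illustrative, not from the thread; it assumes a context whose pixel format actually has destination alpha, and needs a live GL context to run):

```c
/* Allocate an RGBA texture once, then copy the framebuffer into it. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
/* GL_RGBA8 as internal format keeps the alpha channel */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 256, 256, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
/* later, per frame: */
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 256, 256);
```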

This worked for me. I just finished a demo that renders textures using a pbuffer, and I had no problem making the texture transparent. If you want to see it, the demo will be on my site in a few hours.

- Tom

mcraighead
04-06-2001, 08:42 AM
glGetIntegerv(GL_ALPHA_BITS, &x);

should tell you what's going on.

Note that we cannot accelerate a copy from 32-bit mode without destination alpha to an RGBA8 texture. With destination alpha you can copy to either RGB8 or RGBA8 fast, and without it you can still copy to RGB8 fast.

(In 16-bit, of course, you should stick to RGB5.)

- Matt

Zeno
04-06-2001, 04:04 PM
Here's the scoop:

I am running windows in 32 bit color mode (windows 2k).

I am using GLUT, and I initialize the window with glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);

I set both the internal and external texture formats using glTexImage2D to be GL_RGBA. Is there a difference between this and GL_RGBA8?

Now, here's something curious: I tried glGetIntegerv(GL_ALPHA_BITS, &x), and it returns 0! I guess this is the source of my problem? I am able to do lots of alpha blending in this program; it's only the texture I get from the backbuffer that doesn't have an alpha channel. I assume all my other blending worked because I'm using GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, so the destination (backbuffer) alpha never gets used?

Is there a way to use GLUT to do what I want, or do I have to do Windows stuff? :(

Thanks,
-- Zeno

HFAFiend
04-06-2001, 04:11 PM
16-bit color works if you're in 4,4,4,4 mode (not 5,6,5,0 mode).

mcraighead
04-06-2001, 05:26 PM
GLUT_ALPHA
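That is, add GLUT_ALPHA to the display-mode flags so GLUT requests a pixel format with destination alpha bits (a fragment, not a complete program):

```c
/* Request a pixel format with alpha bits in the color buffer: */
glutInitDisplayMode(GLUT_RGBA | GLUT_ALPHA | GLUT_DOUBLE | GLUT_DEPTH);
```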

- Matt

Zeno
04-06-2001, 06:30 PM
Thanks guys...you saved me tons of time. Not adding GLUT_ALPHA to the display mode was the problem.

-- Zeno