GL_RGB to GL_ALPHA



Matthew Wahab
03-25-2008, 11:09 AM
Hi,

I'm trying to pass RGB texture data to OpenGL and have it internally represented as GL_ALPHA.

e.g. Alpha = x*R + y*G + z*B, where x, y, z are some constants.

I've tried:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, data);

but don't get the correct results.

Also, I know I can do this easily by iterating over each pixel and converting it myself, but I want to avoid the CPU overhead if possible.

Help!!

(NOTE: I can't use shaders, this has to be done with the fixed-function pipeline)

CRasterImage
03-25-2008, 11:32 AM
You have a pointer to the data? (pixels)

If so, why don't you just alter the data?
Or, if you need to preserve the data, alter a copy?
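For reference, the CPU-side conversion the original post mentions is only a few lines. A minimal sketch, assuming tightly packed 8-bit RGB input and the common 0.30/0.59/0.11 luminance weights (the function name and weights are illustrative, not from the thread):

```c
#include <stddef.h>

/* Collapse a tightly packed RGB buffer (3 bytes/pixel) into a
 * single-channel alpha buffer (1 byte/pixel) using fixed weights.
 * Substitute your own x, y, z constants for the weights below. */
static void rgb_to_alpha(const unsigned char *rgb, unsigned char *alpha,
                         size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; ++i) {
        const unsigned char *p = rgb + 3 * i;
        float a = 0.30f * p[0] + 0.59f * p[1] + 0.11f * p[2];
        alpha[i] = (unsigned char)(a + 0.5f); /* round to nearest */
    }
}
```

You would then upload the result with glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, alpha).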

zed
03-25-2008, 11:41 AM
x*R + y*G + z*B is a dot product, so you could just do:

gl_FragColor.w = dot( texture2D( tex0, texcoord ).rgb, vec3(x, y, z) );

Matthew Wahab
03-25-2008, 12:20 PM
Thanks, but I'm trying to do it WITHOUT using shaders.

e.g. using a stride when uploading the texture data, or some kind of texture combiner that puts a linear combination of the RGB channels into the alpha channel.

-NiCo-
03-25-2008, 12:26 PM
If I understand correctly, you're trying to send RGB data to the GPU and have it stored as an alpha texture, right?

You can try using the color matrix (part of the ARB imaging subset, so check that GL_ARB_imaging is available first):

/* The color matrix transforms (R,G,B,A); with glLoadTransposeMatrixf the
 * array below is read row-major, so the bottom row computes
 * A' = 0.30*R + 0.59*G + 0.11*B (standard luminance weights). */
float RGB_2_I[16] = { 0.0,  0.0,  0.0,  0.0,
                      0.0,  0.0,  0.0,  0.0,
                      0.0,  0.0,  0.0,  0.0,
                      0.30, 0.59, 0.11, 0.0 };

glMatrixMode(GL_COLOR);
glLoadTransposeMatrixf(RGB_2_I);

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, datapointer);

glLoadIdentity();            /* reset the color matrix...           */
glMatrixMode(GL_MODELVIEW);  /* ...and restore the usual matrix mode */
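Since not every driver exposes the imaging subset, it's worth testing for GL_ARB_imaging before relying on the color matrix. A minimal sketch of a full-token check against the extension string (the helper name is illustrative; in a real program the string comes from glGetString(GL_EXTENSIONS)):

```c
#include <string.h>

/* Return 1 if `name` appears as a complete space-separated token in
 * `ext`, so that e.g. "GL_ARB_imaging" does not falsely match a
 * hypothetical "GL_ARB_imaging_foo". */
static int has_extension(const char *ext, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext;
    while ((p = strstr(p, name)) != NULL) {
        int start_ok = (p == ext) || (p[-1] == ' ');
        int end_ok   = (p[len] == '\0') || (p[len] == ' ');
        if (start_ok && end_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

Usage would look like: if (has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_imaging")) { /* use the color matrix */ }.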