GL_RGB to GL_ALPHA

Hi,

I’m trying to pass RGB texture data to OpenGL and have it internally represented as GL_ALPHA.

e.g. Alpha = x*R + y*G + z*B, where x, y, z are some constants

I’ve tried:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, data);

but don’t get the correct results.

Also, I know I could do this easily by iterating over each pixel and converting it myself, but I want to avoid that CPU overhead if possible.

Help!!

(NOTE: I can’t use shaders, this has to be done with the fixed-function pipeline)

Do you have a pointer to the data (the pixels)?

If so, why don't you just alter the data before uploading it?
Or, if you need to preserve the original, alter a copy?
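
A rough sketch of what that conversion could look like, assuming 8-bit RGB data already sitting in memory (rgb, w, h and the weights x, y, z are placeholder names, not from your code):

/* collapse the RGB buffer into a single-channel alpha buffer on the CPU,
   then upload it as GL_ALPHA */
unsigned char *alpha = (unsigned char *)malloc((size_t)w * h);
for (int i = 0; i < w * h; ++i) {
    /* weighted sum of the three channels, clamped to [0, 255] */
    float a = x * rgb[3*i + 0] + y * rgb[3*i + 1] + z * rgb[3*i + 2];
    alpha[i] = (unsigned char)(a < 0.0f ? 0.0f : (a > 255.0f ? 255.0f : a));
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, w, h, 0, GL_ALPHA, GL_UNSIGNED_BYTE, alpha);
free(alpha);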

x*R + y*G + z*B
is a dot product, so you could just do

gl_FragColor.w = dot( texture2D( tex0, texcoord ).xyz, vec3(x, y, z) );

Thanks, but I'm trying to do this WITHOUT using shaders.

I was thinking of something like a stride trick when loading the texture, or some kind of texture combiner that would put a linear combination of the RGB channels into the alpha channel.

If I understand correctly, you're trying to send RGB data to the GPU and have it stored as an alpha texture, right?

You can try using the color matrix, which is part of the imaging subset (GL_ARB_imaging):


/* last row writes A = 0.30*R + 0.59*G + 0.11*B during pixel transfer */
float RGB_2_I[16] = { 0.0,  0.0,  0.0,  0.0,
                      0.0,  0.0,  0.0,  0.0,
                      0.0,  0.0,  0.0,  0.0,
                      0.30, 0.59, 0.11, 0.0 };

glMatrixMode(GL_COLOR);               /* the color matrix is applied to pixels as they are uploaded */
glLoadTransposeMatrixf(&RGB_2_I[0]);  /* array above is written row-major, so load it transposed */

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE, datapointer);

glLoadIdentity();                     /* reset the color matrix ... */
glMatrixMode(GL_MODELVIEW);           /* ... and switch back to the usual matrix stack */
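
One caveat: the color matrix is only available when the imaging subset is supported, and glLoadTransposeMatrixf needs OpenGL 1.3 (or GL_ARB_transpose_matrix), so it may be worth checking before relying on it. A rough sketch of such a check (the fallback comment just refers to the CPU conversion suggested earlier):

const char *exts = (const char *)glGetString(GL_EXTENSIONS);
if (exts == NULL || strstr(exts, "GL_ARB_imaging") == NULL) {
    /* no color matrix available: fall back to converting the pixels on the CPU */
}

If the transpose-load entry point isn't available either, you can simply write the same matrix in column-major order and load it with plain glLoadMatrixf.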