Simple mask - fill with colour?

Hey!
I’ve got a texture, and I’d like to flood a bit of colour with some transparency on top of it (say red, with alpha equal to 0.5).
What I’ve got as input data is the texture and the mask (white / black OR (!) white with shades of grey).
Now what’s the best and easiest way to go? I do not need the best performance available, what I need is good performance and portability :slight_smile:

Here’s what I’ve got:


unsigned char pixeldata[imagewidth * imageheight];
unsigned char mask[imagewidth * imageheight];

...

if (necessary)
     doThresholdOnMask(mask);

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, imagewidth, imageheight, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, pixeldata);
..
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, imagewidth, imageheight, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, mask);

displayTexture();
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glAlphaFunc(GL_GREATER, 0.1f);
glEnable(GL_ALPHA_TEST);
displayMask();

and this gives me almost what I need, except ‘the mask’ is black and I’d like it to be reddish (with some transparency). Do I need to do some multi-texturing? Or maybe glTexEnv is the way to go (somehow)? Or maybe would setting GL_ALPHA as the format in glTexImage2D be something for me? What does it do, by the way? I don’t completely understand all the manuals.

I’ve found this: http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=20 and it looks good, but since I only need some colour, I guess I don’t need to first draw the texture, then apply the mask and then flood it with red. Or maybe that is the best solution?

Now, it would be perfect if the mask could be in shades of grey and the transparency of my colour depended on those values (but this is not essential).

What’s the best thing to do?

Thanks for any help,
Kornel

Use a per-vertex or global RGBA color and set a GL_MODULATE texture environment using glTexEnvi; it should be the default anyway.
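
Roughly what I mean, as a minimal sketch (maskTex and maskData are placeholder names for your alpha-mask texture and data):

/* Sketch: upload the mask as a GL_ALPHA texture and draw a red quad over the
   background with GL_MODULATE (the default texenv). For an ALPHA texture,
   MODULATE passes the primary color through and multiplies the alphas,
   so the quad comes out red with alpha = 0.5 * mask. */
glBindTexture(GL_TEXTURE_2D, maskTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, imagewidth, imageheight, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, maskData);

glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glColor4f(1.0f, 0.0f, 0.0f, 0.5f);   /* red, half transparent */
/* ... draw the textured quad on top of the background ... */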

The real problem is the use of LUMINANCE. Is this an assignment? Another guy asked a similar question, without your detailed groundwork and without getting as far with the texture setup…

If you can use a LUMINANCE_ALPHA image, even better. Many platforms support plain GL_ALPHA, which is ideal for you, but I’m not sure how portable it is. I’d suppose it’s actually very common these days, but you can check.
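
For the LUMINANCE_ALPHA route you’d interleave the two arrays yourself before the upload; a rough sketch using your pixeldata/mask arrays (just the packing, not the whole draw):

/* Pack luminance + mask into one GL_LUMINANCE_ALPHA texture. */
unsigned char *la = (unsigned char *)malloc(imagewidth * imageheight * 2); /* needs <stdlib.h> */
for (int i = 0; i < imagewidth * imageheight; ++i)
{
    la[2 * i + 0] = pixeldata[i];   /* luminance */
    la[2 * i + 1] = mask[i];        /* alpha */
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, imagewidth, imageheight, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, la);
free(la);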

If you’re prepared to use shaders (portability?) then you could swizzle your luminance to the alpha channel and do your color modulation in your shader.
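
If you do go the shader route, the fragment shader is tiny; a hypothetical GLSL sketch (the maskTex uniform name and the hard-coded red/0.5 are just placeholders):

/* Hypothetical fragment shader, kept here as a C string. A LUMINANCE texture
   samples as (L, L, L, 1), so .r is the mask value; it gets moved into alpha
   and the fragment is colored red. */
static const char *fragSrc =
    "uniform sampler2D maskTex;\n"
    "void main() {\n"
    "    float m = texture2D(maskTex, gl_TexCoord[0].st).r;\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 0.5 * m);\n"
    "}\n";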

Combiner operands never really allowed alpha to pull from color channels, unfortunately.

Hey!
Thanks for your hint. It’s sort of an assignment, the ultimate one: it’s my MSc thesis, which has nothing to do with OpenGL; I only use it to visualize some data :slight_smile:

Now here’s what I did:


glGenTextures(1, &mask);
glBindTexture(GL_TEXTURE_2D, mask);
glTexParameteri(...);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, imagewidth,
 imageheight, 0, GL_ALPHA, GL_UNSIGNED_BYTE, maskData); /* maskData = the unsigned char alpha values; 'mask' is the texture id */

and the render code:


drawBackgroundTexture();
glEnable(GL_BLEND);
//glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); 
//glBlendFunc(GL_DST_COLOR, GL_ZERO);
//glBlendFunc(GL_DST_COLOR, GL_ONE_MINUS_SRC_ALPHA);
//glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
//glAlphaFunc(GL_GREATER, 0.1f);
//glEnable(GL_ALPHA_TEST);
	
glBindTexture(GL_TEXTURE_2D, mask);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); //should be default as you've mentioned
glBegin(GL_QUADS);		
glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
glTexCoord2f(0.0f, 1.0f); glVertex2i(0, 0);
glTexCoord2f(1.0f, 1.0f); glVertex2i(imagewidth, 0);
glTexCoord2f(1.0f, 0.0f); glVertex2i(imagewidth, imageheight);
glTexCoord2f(0.0f, 0.0f); glVertex2i(0, imageheight);	
glEnd();

glDisable(GL_BLEND);

But this is wrong. I’d like to make sure I understand the glTexImage2D parameters:

internalFormat
Specifies the number of color components in the texture …

Ok, according to this, GL_ALPHA tells OpenGL that what I have is only the alpha channel, hence every texel will be RGBA = (0, 0, 0, datafromfile), right?

format
Specifies the format of the pixel data. …

Specifying this as GL_ALPHA tells OpenGL that in the data array (given as the last parameter) it will find only the alpha channel (each value of size GL_UNSIGNED_BYTE).

So my code seems quite OK now; anyway, the result is wrong. Is it because of the glBlendFunc?
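
Just to spell out what I expect to happen per pixel once blending is set up (my understanding of the spec, so please correct me):

/* With GL_MODULATE on a GL_ALPHA texture and glColor4f(1, 0, 0, a):
     fragment colour = (1, 0, 0)   -- the primary colour passes through
     fragment alpha  = a * mask    -- the alphas are multiplied
   and with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
     framebuffer = a*mask * (1, 0, 0) + (1 - a*mask) * background */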

Thanks for any hints,
Kornel

p.s.
I forgot: I update the texture using


glBindTexture(GL_TEXTURE_2D, mask);
glTexParameteri(...);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, imagewidth, 
imageheight, GL_ALPHA, GL_UNSIGNED_BYTE, overlay);

But this shouldn’t change much. BTW, is glTexParameteri necessary while doing glTexSubImage2D?

I think you have it right; the teximage call looks OK for GL_ALPHA texture creation, but it may not be supported. Try glGetError after you make the call. I mean, it SHOULD be supported, but…
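
Something like this right after the upload (just a sketch, same variables as your code):

glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, imagewidth, imageheight, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, maskData);
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage2D failed with error 0x%04X\n", err); /* needs <stdio.h> */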

I’ve seen some platforms that are really fussy about teximage formats.

Still, you have GL_ALPHA to GL_ALPHA; it should work.

TexParameteri in the update should be redundant; the parameters persist with that texture object (bind handle), so they are still whatever you set when you created it.

You’re saying it’s working now, right?

Your update code looks OK too, at a glance.

Yes, indeed, it works. I had some trouble, but it was the mask: I had supplied invalid data.

Thanks a lot!