Problem with GL_REPLACE and glAlphaFunc

I think I have found a bug in the Microsoft generic implementation of OpenGL… or at least I am having some problems with it…

Basically, if I use the GL_REPLACE texture environment and I enable the alpha test with

glAlphaFunc(GL_GREATER, 0.0)

I don’t get anything rendered if I am using an RGB texture. If I use an RGBA texture it works; if I don’t enable the alpha func it works; if I do glAlphaFunc(GL_GEQUAL, 0.0) it works. If I use GL_DECAL instead of GL_REPLACE it works.

I tested this on the Mac, and it seems to work OK there too. It only seems to be a problem when I am using the generic implementation on Windows machines, i.e., rendering to a DIB section.

Does anyone know if this is a known issue?

I can post some code if it would help too.

Thanks

What is your setup code for each texture unit? What texture format are you using? Also, what hardware are you on? The texture format you are choosing might be internally promoted to RGBA; if you are using RGB, the alpha may be getting set to 1.0…
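Something like this (an untested sketch, run with your texture bound after the glTexImage2D call) should show what the implementation actually stored:

	/* Ask the GL what internal format it actually chose... */
	GLint fmt = 0, alphaBits = 0;
	glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
	/* ...and whether it allocated any alpha bits for it. */
	glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits);
	printf("internal format 0x%x, alpha bits %d\n", fmt, alphaBits);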

What is your setup code for each texture unit?

Here is some setup code:

	glEnable(GL_ALPHA_TEST);
	glAlphaFunc(GL_GREATER,0.000000);
	glColorMask(TRUE,TRUE,TRUE,TRUE);
	glClearColor(1.000000,0.000000,0.000000,0.000000);
	glClearDepth(1.000000);
	//glEnable(GL_TEXTURE_2D);
	glGenTextures(1,&textureObj);
	glBindTexture(GL_TEXTURE_2D,textureObj);
	//glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP);
	//glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
	glTexEnvi(GL_TEXTURE_ENV,GL_TEXTURE_ENV_MODE,GL_REPLACE);
	glPixelStorei(GL_UNPACK_ALIGNMENT,1);
	glTexImage2D(GL_TEXTURE_2D,0,6407,16,16,0,GL_RGB,GL_UNSIGNED_BYTE, mytexture); // 6407 == 0x1907 == GL_RGB

mytexture is just a big array of 4096 GLubytes, all set to 0xff.
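In other words, something like:

	static GLubyte mytexture[4096];             /* plenty for a 16x16 RGB texture */
	memset(mytexture, 0xff, sizeof(mytexture)); /* every byte 0xff */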

Here is the code for drawing the scene:

	glClear(GL_DEPTH_BUFFER_BIT | GL_COLOR_BUFFER_BIT);
	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(3,3,3,0,0,0,0,1,0);

	glColor4f(1.000000, 0.000000, 1.000000, 1.000000);
	glBegin(GL_QUADS);
	glTexCoord2f( 0.000000, 0.000000 );
	glVertex3f(-1.000000, 1.000000, 0.000000 );
	glTexCoord2f(0.000000, 1.000000 );
	glVertex3f(-1.000000, -1.000000, 0.000000);
	glTexCoord2f(1.000000, 1.000000 );
	glVertex3f(1.000000, -1.000000, 0.000000);
	glTexCoord2f(1.000000, 0.000000);
	glVertex3f(1.000000, 1.000000, 0.000000);
	glEnd(); 

What texture format are you using?

I am using GL_UNSIGNED_BYTE with GL_RGBA

What hardware are you on?

I am using the generic Microsoft renderer, but incidentally the card is a Radeon X600.

The texture format you are choosing might be internally promoted to RGBA; if you are using RGB, the alpha may be getting set to 1.0…

I think that would be fine, but it seems like the alpha is actually being set to 0.0, which is not so good.

Well, your glTexImage2D is sending an internal format of 6407, not GL_RGBA8…

6407 is GL_RGB (0x1907); isn’t that legit? Also, there is a mistake in my posted code…

The glEnable(GL_TEXTURE_2D) should be uncommented.

Sorry, and thanks for the reply.

Incidentally I did try GL_RGB8, same effect, but GL_RGBA8 does work.

I guess my questions are then:

What should the internal format be set to? The description says it is the “number of color components in the texture”, and the data describes 3… I assumed that this would be auto-converted with an alpha of 1.0…

Why is the behavior different per platform? That is, why does the software renderer render one way, a hardware renderer another, and the Mac the same as the hardware renderer?

For GL_RGB the internal format is decided by the driver you are using; e.g. NVIDIA, AFAIK, bumps GL_RGB to GL_RGBA internally. For ATI I don’t know. As for what the alpha value is set to when you ask for GL_RGB and it stores an alpha, I am assuming 1.0, but I haven’t looked into it. Set up your call like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 16, 16, 0, GL_RGBA, GL_UNSIGNED_BYTE, mytexture);

See what happens then… You will need to allocate your array large enough for this now, so:

unsigned char *mytexture = new unsigned char[16 * 16 * 4];
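If your source data is three-component, you would also need to expand it to four components yourself; a rough sketch (rgb here stands for your original 16×16×3 data):

	/* Expand 16x16 RGB data to RGBA, forcing the alpha to fully opaque. */
	unsigned char *rgba = new unsigned char[16 * 16 * 4];
	for (int i = 0; i < 16 * 16; ++i) {
		rgba[i * 4 + 0] = rgb[i * 3 + 0];
		rgba[i * 4 + 1] = rgb[i * 3 + 1];
		rgba[i * 4 + 2] = rgb[i * 3 + 2];
		rgba[i * 4 + 3] = 0xff; /* opaque */
	}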

OK… well, I did as you said and, of course, it works…

But… I think there is still a bug in the implementation and that is what I am really trying to figure out.

According to the GL spec, the GL_RGB format is supposed to be applied as follows in the GL_REPLACE environment:

R[vertex] = R[texture]
G[vertex] = G[texture]
B[vertex] = B[texture]
A[vertex] = A[fragment]

Unless I am missing something, A[fragment] should be the same as what I set as the alpha in glColor4f(), no?

As you can see above, I set the alpha component to 1.0f…

So I would assume that the color applied should have an alpha of 1.0f, not 0.0 as it seems to.
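One way to confirm that (an untested idea) would be to invert the test; if the fragment alpha really is arriving as 0.0, this should make the quad show up on the buggy renderer:

	glAlphaFunc(GL_LESS, 0.5f); /* passes only if incoming alpha < 0.5 */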

Any thoughts?

AFAIK, if you are texturing, glColor*f() has no effect until you turn off texturing. glColor4f() is used if you want to color your polygons, lines, etc. yourself. Whereas for a texture you need to use

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

AFAIK, if you are texturing, glColor*f() has no effect until you turn off texturing. glColor4f() is used if you want to color your polygons, lines, etc. yourself.

Whether it has an effect or not depends on the type of the texture and the texture function used.

Do you know what they mean, then, by “fragment color”?

as per the spec:

(paraphrased a bit)

The value of TEXTURE_ENV_MODE specifies a texture function. The result of this function depends on the fragment and the texture array value. The precise form of the function depends on the base internal formats of the texture arrays that were last specified. In the following two tables, R(f), G(f), B(f), and A(f) are the color components of the incoming fragment; R(t), G(t), B(t), and A(t) are the filtered texture values; R(c), G(c), B(c), and A(c) are the texture environment color values; and R(v), G(v), B(v), and A(v) are the color components computed by the texture function. All of these color values are in the range [0,1].

Here is the table for GL_REPLACE and GL_RGB

R(v) = R(t)
G(v) = G(t)
B(v) = B(t)
A(v) = A(f)

From

http://www.opengl.org/documentation/spec…000000000000000

chapter 3.8.5

Since I set the fragment alpha to 1.0 via glColor4f(), A(f) should be 1.0, and 1.0 > 0.0 should pass the alpha test. But it seems that in the Microsoft software case

A(f) = 0

otherwise

A(f) = (some non-zero value)

Thanks

Originally posted by frogger1999:

But… I think there is still a bug in the implementation and that is what I am really trying to figure out.

That is quite possible. The generic renderer is not used much, so there may be a problem in a rarely used combination.

The alpha component is only used in texturing if you use a combiner function that needs it. Examples are modulate and any of the functions in the env combiner extension.
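For example, with modulate the alpha you set through glColor4f() does feed into the result; a minimal sketch:

	/* With GL_MODULATE and an RGBA texture, A(v) = A(f) * A(t), so the
	   fragment alpha actually scales the texture alpha. With GL_REPLACE
	   on an RGB texture, the texture function just passes A(f) through. */
	glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
	glColor4f(1.0f, 1.0f, 1.0f, 0.5f); /* this 0.5 becomes A(f) */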