View Full Version : GL_TEXTURE_ENV_COLOR BUG ???

07-03-2003, 08:56 AM
Sorry to disturb you, but OpenGL GL_COMBINE_ARB is driving me crazy!
Here is my problem: I have disabled lighting, and I render a mesh with 3 texture units enabled:
- first = my normal map,
- second = a texture without any pixels; only the texture_env_color is set to the desired value (glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, afBlend)) ;)
- third = a texture without any pixels; only the texture_env_color is set to the desired value (glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, afBlend)) ;)

For the second channel I use the GL_COMBINE_ARB value for the GL_TEXTURE_ENV_MODE target and the following values for sources and operands:


For the third channel I use the same technique, but with GL_ADD instead of GL_MODULATE.
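The original listing of sources and operands did not survive in the archive, so here is a hedged reconstruction of what a typical setup for the second unit would look like (modulating the previous unit's result with the constant color); the third unit would be the same with GL_ADD as the combine function. The value of afBlend is a made-up example, and this assumes a current context exposing ARB_multitexture and ARB_texture_env_combine:

```c
/* Hypothetical reconstruction of the combine setup for texture unit 1.
 * Requires a current GL context with ARB_multitexture and
 * ARB_texture_env_combine. afBlend here is just an example value. */
GLfloat afBlend[4] = { 0.4f, 0.4f, 0.4f, 1.0f };

glActiveTextureARB(GL_TEXTURE1_ARB);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
/* result = previous_unit_color * constant_color */
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_MODULATE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_CONSTANT_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR);
glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, afBlend);
```

Since this is pure GL state setup, it only makes sense inside a program that already has a rendering context current.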

My problem is that the final result does not take my 2 blend colors into account, so I have nice per-pixel bump lighting but without the diffuse and ambient components. It is a little bit annoying.

To me, these 2 lines specify the color to modulate/add to the previously computed fragment: glTexEnvi(GL_TEXTURE_ENV,GL_SOURCE0_RGB,
Am I wrong?

If anybody could help me ... Thanks a lot.

07-03-2003, 11:00 AM
If the texture object bound to the first texture unit is the normal map, does that mean that you are computing the per-pixel lighting in register combiners or fragment shaders?

Apart from that, have you enabled texturing for TEXTURE1_ARB and TEXTURE2_ARB?

07-03-2003, 12:47 PM
It's a requirement that you bind textures to each texture environment. Even if you never sample them, you still need to bind (dummy) textures.

07-04-2003, 05:18 AM
For Vincoof :
I compute the per-pixel lighting with a fragment shader; I do not use register combiners, because I want to be compatible with as many graphics cards as possible (not only NVIDIA cards).
And yes, I have enabled the 2 texture units.

For zeckensack :
By the sentence "It's a requirement that you bind textures to each texture environment," do you mean that I should call glBindTexture(GL_TEXTURE_2D, pTexId) before calling glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, afBlend)?
If that is the meaning of your answer, then yes, I do.

07-04-2003, 06:04 AM
If you're trying to use the texture environment functions of TEXTURE1_ARB and TEXTURE2_ARB while fragment programs are enabled, I guess you're out of luck.

Fragment programming completely overrides the texturing stage, as well as the color sum and fog stages. In other words, you cannot simultaneously use, for instance, a fragment program for texture unit #0 and standard texturing for texture units #1 and #2.
You will have to perform your scale and bias in the fragment program. Fortunately it can be done in a single instruction (the famous MAD operation). Unfortunately, I'm not sure whether a fragment program can use multiple texture environment colors in a single operation. I know that old NVIDIA hardware had limitations about this when using register combiners.

As for what zeckensack wrote, it's about the validity of the texture bound to the unit. Sure, you don't sample texels on texture units #1 and #2, but that does not mean you can bind invalid texture objects to those texture units.
You wrote :
"- second = a texture without any pixels, only the texture_env_color is set to the desired value,
- third = a texture without any pixels, only the texture_env_color is set to the desired value, "
But in your case, what is a "texture without any pixels"? If that texture is not valid, chances are the result will be undefined for those texture units.

Sorry for the long post. hth

07-04-2003, 08:34 AM
You're saying fragment shader, but you're also saying TexEnv. The two are mutually exclusive. You don't get one if you use the other.

To enable the texture environment operation (for GL_COMBINE) you have to glActiveTexture() to that texture unit and glEnable( GL_TEXTURE_xD ). You can leave texture object 0 bound if you want; just make sure it's enabled. For clarity, you might want to bind a 1x1 pixel white texture.
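A minimal sketch of the 1x1 white dummy texture suggested above; the variable names are illustrative, and this assumes a current GL context with ARB_multitexture:

```c
/* Create a 1x1 opaque-white texture and bind it to texture unit 1,
 * so the combine environment on that unit has a valid (if trivial)
 * texture to sample. Requires a current OpenGL context. */
static const GLubyte white[4] = { 255, 255, 255, 255 };
GLuint uiDummyTex;

glGenTextures(1, &uiDummyTex);
glActiveTextureARB(GL_TEXTURE1_ARB);
glBindTexture(GL_TEXTURE_2D, uiDummyTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1, 1, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, white);
glEnable(GL_TEXTURE_2D);
```

Because the texture is pure white, modulating by it is a no-op, so it never changes the combine result.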

07-07-2003, 01:43 AM
Thanks a lot for your help: in fact the problem was that I had not created a valid texture. It seems that for OpenGL, a valid texture must have at least 1 pixel.
As for the shader vs. register combiners vs. GL_COMBINE question: I replied that I use a pixel shader because, to me, using GL_COMBINE means using a kind of "pre-computed" pixel shader, but maybe I am wrong; sorry for the misunderstanding.

THANKS gentlemen, now it works :)

07-07-2003, 10:25 AM
From the Holy specifications, chapter 3.8.1 (Texture Image Specification), p 126 :
"An image with zero width, height (TexImage2D and TexImage3D only), or depth (TexImage3D only) indicates the null texture. If the null texture is specified for the level-of-detail specified by texture parameter TEXTURE_BASE_LEVEL (see section 3.8.4), it is as if texturing were disabled."

So, yes, you have to specify a non-null image, otherwise the texture unit acts as if texturing were disabled.