About glBlendFunc & multitexturing.

Hello,

I am trying to set up different ways of blending textures with multitexturing, using OpenGL’s texture units.

First, I can blend two textures this way using 3 TUs:

glBlendFunc(GL_ONE, GL_ZERO);
       
        glClientActiveTexture( GL_TEXTURE0_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE0_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[2].glID );
        
        
        glClientActiveTexture( GL_TEXTURE1_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE1_ARB );
        glEnable( GL_TEXTURE_2D );
        glBindTexture( GL_TEXTURE_2D, terrainTexList[1].glID );


        glClientActiveTexture( GL_TEXTURE2_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE2_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[0].glID );
        
        /* GL_INTERPOLATE computes: result = Arg0*Arg2 + Arg1*(1 - Arg2) */
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_INTERPOLATE );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE1 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE2 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE2_RGB_ARB, GL_TEXTURE0 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND2_RGB_ARB, GL_ONE_MINUS_SRC_COLOR );
  

This way, the textures terrainTexList[0].glID and terrainTexList[1].glID are correctly blended, with terrainTexList[2].glID as the t factor (this is a grayscale texture), but the mesh lighting disappears…which is logical!

So, in order to keep the lighting without a lightmap, I have tried to do this:

glBlendFunc(GL_ONE, GL_ZERO);
       
        glClientActiveTexture( GL_TEXTURE0_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE0_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[0].glID );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE0 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );
        
        
        
        glClientActiveTexture( GL_TEXTURE1_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE1_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[2].glID );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE1 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );
        
        
        glBlendFunc(GL_ONE, GL_ONE);
        
        glClientActiveTexture( GL_TEXTURE2_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE2_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[1].glID );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE2 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE1 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_ONE_MINUS_SRC_COLOR );

So in the first unit I activate terrainTexList[0].glID and apply it to the mesh. If we stop here, we see terrainTexList[0].glID applied to the mesh with lighting.
Then in the 2nd unit I activate terrainTexList[2].glID and multiply it with the last texture applied to the mesh (terrainTexList[0].glID + lighting).
Finally, I want to multiply terrainTexList[1].glID with terrainTexList[2].glID inverted and apply this to the screen without replacing the last screen render, so the last unit’s result is just added to the screen. To do this I set glBlendFunc to GL_ONE for the source and GL_ONE for the destination; in my opinion the previous render on the screen, due to the 1st and 2nd units, should not be replaced…but when I start the program I see only the third unit’s result on the screen!
So there is a problem in my reasoning…can you help me?
Thank you.

Nobody is inspired by this question? Is it not clear enough?

First off: blending is separate from texturing. It always blends the result of the last texturing stage with the current content of the framebuffer (i.e. the results of previous draw operations).

The result in the first sample is

tex1*tex0 + tex2*(1-tex0)

the result of the second example is (if blending is enabled)

dst + color * tex0 * tex2 * (1-tex1)

I assume what you intended was

(tex1*tex0 + tex2*(1-tex0)) * color

which you easily get by taking the first sample and setting the second texture unit to

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE);

glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);

glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB);
glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_ONE_MINUS_SRC_COLOR);

Thank you for replying!

Yes, I want to do this:

(tex1*tex0 + tex2*(1-tex0)) * color

But I don’t understand why you are using GL_PRIMARY_COLOR_ARB in the second unit (so GL_TEXTURE1?).
Because it is the texture RGB data that I want to use…not a color. In the second code example I activate the texture terrainTexList[2].glID, which is the grayscale texture.

I’m using the primary color because you said you wanted to include the lighting results. Those are stored in the primary color.

I use MODULATE with PREVIOUS and PRIMARY_COLOR, so the result of the first unit gets modulated with the result of the lighting stage. The first unit calculates the color of the terrain (by interpolating two color textures according to a third greyscale texture), and the second unit incorporates the vertex lighting results by modulating them with the output of the first unit.

Hmm…I don’t understand everything…

On the one hand, I don’t see how you can interpolate tex0 and tex1 using tex2 as the blend factor in ONE unit, because I can’t bind several textures to one unit…(or maybe you have a solution for that…)

On the other hand, I thought that the arithmetic carried out by OpenGL is done in this order:

framebuffer (operation) TU 0 (operation) TU 1 (operation) …

So it would not be possible to use the lighting color data after the first unit, because it is lost in the operation between the framebuffer and the first unit…
And I thought that GL_PRIMARY_COLOR allows OpenGL to use the current color defined by glColor*… no?

Thank you.

So, how did your first example work then? Did you not compute the interpolation on the first unit, using units two and three only to bind additional textures? All I did was extend that scenario and also use the second unit to incorporate lighting. Of course, I still need three units to hold all the textures.

Without ARB_texture_env_combine, the primary color (which is either set via glColor or generated by lighting) is indeed only available to the first texture unit. With ARB_texture_env_combine, however, it is available to all units.

Also, all framebuffer operations (blending) are done after texturing is finished.

Excuse me, I was not able to answer more quickly…

If you look at the first example, you will see that I compute the interpolation in the last TU (i.e. TU 2) and not in the first unit.

But I find it very strange that “all framebuffer operations (blending) are done after texturing is finished.” Are you sure? You are probably right, because when I blend the textures in two passes it works…
It is a shame…I don’t see the point of that.
But now I don’t understand how I can blend the textures without blending…

But I find it very strange that “all framebuffer operations (blending) are done after texturing is finished.” Are you sure? You are probably right, because when I blend the textures in two passes it works…
It is a shame…I don’t see the point of that.

dletozeun, it is done this way, that’s all. Such a fixed-function graphics pipeline was designed back when being able to use even a single 256*256 texture was ultra c00l gr4fiX :smiley:

More details about the fixed-function pipeline:
diagram

details
Blending only blends between DST (“what was on the screen”) and SRC (“what you are drawing now”). Multitexture operations are done beforehand, to prepare SRC.
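
A minimal sketch of that, only showing how the factors passed to glBlendFunc act on SRC and DST (using the two functions from the examples above):

        glEnable( GL_BLEND );
        /* SRC = output of the last enabled texture unit, DST = pixel already in the framebuffer */
        glBlendFunc( GL_ONE, GL_ZERO );       /* framebuffer = SRC*1 + DST*0 : SRC replaces the pixel   */
        /* or, as in the second example:
        glBlendFunc( GL_ONE, GL_ONE );           framebuffer = SRC*1 + DST*1 : SRC is added to the pixel */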

For more modern use, as memfr0b said, take the texture_env_combine extension; it is supported all the way down to the GeForce 2. Have a look at the spec:
http://oss.sgi.com/projects/ogl-sample/registry/ARB/texture_env_combine.txt

If you want even more flexible stuff, take the GLSL path.

Ok, thank you very much both.

I will look at these papers later, because I am currently busy…I will keep you informed.

I read these papers quickly, but there is nothing about the multitexturing mechanism…I would like a paper that explains how multitexturing works across the different texture units…
I keep searching and I find nothing about this subject, even on opengl.org.

About the GL_ARB_texture_env_combine extension: don’t I already use it in the previous code? Otherwise, how can I use it? I don’t see the difference from GL_ARB_multitexture…

thank you.

In the original code you actually use three extensions:

- ARB_multitexture - This lets you use multiple texture units and bind more than one texture at the same time.
- ARB_texture_env_crossbar - This lets you use textures bound to one texture unit on another. In your case, you’re using all three textures on the third unit.
- ARB_texture_env_combine - This lets you use the GL_COMBINE_ARB texture environment.
Multitexturing is pretty simple. Without multitexturing, you have a single unit that receives a fragment color, a texel color and an environment color, and combines them in some fashion to produce a new fragment color. Subsequently, the fragment color is subject to further modification, for example by fogging, and if it isn’t discarded by any test (depth test etc.), the fragment color gets written to the framebuffer (which can include blending with the previous pixel color).

With multitexturing, instead of a single texture unit, you have a chain of texture units. The original fragment color is input into the first, modified, and passed on to the next, which repeats the process until the last unit is reached. After the last texture unit, fragment processing continues as before.
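
A rough model of that chain in C, purely illustrative (modulate and texture_chain are made-up names, not OpenGL calls; each real unit can of course use a different combine function):

/* conceptual model of the fixed-function texturing chain */
typedef struct { float r, g, b; } Color;

/* one possible per-unit operation: the default GL_MODULATE */
static Color modulate( Color a, Color b )
{
    Color c = { a.r * b.r, a.g * b.g, a.b * b.b };
    return c;
}

/* primary = lit vertex color (or glColor); texel[i] = sample taken by unit i */
static Color texture_chain( Color primary, const Color *texel, int num_units )
{
    Color frag = primary;                    /* input to the first unit */
    for ( int i = 0; i < num_units; ++i )
        frag = modulate( frag, texel[i] );   /* each unit combines PREVIOUS with its own texel */
    return frag;                             /* afterwards: fog, tests, then blending */
}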

thanks a lot memfr0b.

Yes, multitexturing is a simple idea to understand, but through OpenGL it is a little complicated! lol!

But I still have a problem, because apparently all the texture units have to do something to the framebuffer…let me explain:

What I want to do is keep the lighting. So in the first unit I do this:

glClientActiveTexture( GL_TEXTURE0_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE0_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[1].glID );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PRIMARY_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE0 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );

The texture that I have bound in the previous code is, for example, a grass texture.

Then, because I can’t bind several textures per texture unit, I bind the grayscale texture to the 2nd unit (this texture is the blend factor between the grass and soil textures):

glClientActiveTexture( GL_TEXTURE1_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE1_ARB );
        glEnable( GL_TEXTURE_2D );
        glBindTexture( GL_TEXTURE_2D, terrainTexList[2].glID );

But now there is a problem, because OpenGL automatically modulates the framebuffer with the grayscale texture that I have bound! And the grayscale finally appears on the terrain.

Moreover, even if I did not have this problem, another problem appears during the interpolation:

glClientActiveTexture( GL_TEXTURE2_ARB );
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glBindBuffer( GL_ARRAY_BUFFER, mapTexCoordBuffer );
        glTexCoordPointer(2, GL_FLOAT, 0, NULL);
        
        glActiveTexture( GL_TEXTURE2_ARB );
        glEnable( GL_TEXTURE_2D );
        
        glBindTexture( GL_TEXTURE_2D, terrainTexList[0].glID );
        
        /* GL_INTERPOLATE computes: result = Arg0*Arg2 + Arg1*(1 - Arg2) */
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_INTERPOLATE );
        
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE2 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE2_RGB_ARB, GL_TEXTURE1 );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND2_RGB_ARB, GL_ONE_MINUS_SRC_COLOR );

In this last code I bind the soil texture.
The interpolation works, but the lighting still disappears!? Nevertheless, I use GL_PREVIOUS, so OpenGL could use the framebuffer result from TU 0 and TU 1, and the lighting is present after TU 1, no?

The second unit (GL_TEXTURE1_ARB) doesn’t touch the framebuffer. But, as it is still active, it touches the fragment coming out of the first unit.

The default texture environment is GL_MODULATE, so it will (probably) blend the incoming fragment with the bound texture. You have to set the unit to COMBINE with function REPLACE and input PREVIOUS, so it will simply pass the fragment on.
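
A minimal sketch of such a pass-through setup might look like this (texture-environment calls only, using the ARB tokens from the rest of the thread):

        glActiveTexture( GL_TEXTURE1_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_REPLACE );
        /* REPLACE uses only Arg0, so the unit outputs the previous unit's result unchanged */
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );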

The third unit will interpolate the incoming fragment color with the texture bound to itself, using the texture on the second unit as the interpolation factor. However, you’re incorporating the lighting results in the first unit, thus only modulating the texture on the first unit, and not the texture on the third unit.

Instead you have to modulate the output of the unit that’s doing the interpolation.

Thank you memfr0b.

So the 2nd unit is now:

However, you’re incorporating the lighting results in the first unit, thus only modulating the texture on the first unit, and not the texture on the third unit.
I don’t see why…can you explain it please?

Why only the texture on the first unit, and not 1st unit texture*lighting?
Why do you speak about ‘modulating’ when I want to interpolate in the third unit?

Thank you for your patience.

Originally posted by dletozeun:
Why only the texture on the first unit, and not 1st unit texture*lighting?
Why do you speak about ‘modulating’ when I want to interpolate in the third unit?

What he meant was that the third unit interpolates between
( TEXTURE0 * LIGHTING ) and TEXTURE2.

Thank you, but are you sure? Because it is precisely what I wanted to do…and it doesn’t work!

Ok, let’s start by restating the problem:

You want to texture a terrain using four input parameters: two material/color textures, a weight texture and the lighting results. The two color textures will define the color of the terrain, using the weight texture to define how much each texture affects a particular spot on the terrain. The lighting results will define the brightness of the terrain.

This means you need two operations. One being the interpolation between the two color textures, and the other being the modulation of the terrain color by the lighting results. You want the lighting to affect the terrain independently of the weighting of the color textures (a shadowed spot is dark, no matter whether it has grass texture or sand texture), thus you need to modulate the result of the interpolation, not one of its input parameters.

In pseudocode:

color = modulate(interpolate(tex0, tex1, tex2), light)
Or as a formula:
color = (tex2*tex0 + (1-tex2)*tex1) * light

You can get this result by:

Unit0
Texture: first color texture
Function: combine.interpolate(texture0, texture1, texture2)

Unit1
Texture: second color texture
Function: combine.modulate(previous, primary_color)

Unit2
Texture: weight texture
Function: combine.replace(previous)
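
For reference, a minimal sketch of that setup in code, assuming ARB_texture_env_combine and ARB_texture_env_crossbar are available; grassTex, soilTex and weightTex are placeholder texture IDs, and the texture-coordinate setup is omitted:

        /* Unit 0 -- interpolate the two color textures using the weight texture.
           GL_INTERPOLATE computes Arg0*Arg2 + Arg1*(1 - Arg2); the crossbar
           extension lets this unit read the textures bound to units 1 and 2. */
        glActiveTexture( GL_TEXTURE0_ARB );
        glEnable( GL_TEXTURE_2D );
        glBindTexture( GL_TEXTURE_2D, grassTex );
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_INTERPOLATE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE0_ARB );     /* first color texture  */
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_TEXTURE1_ARB );     /* second color texture */
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE2_RGB_ARB, GL_TEXTURE2_ARB );     /* weight texture       */
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND2_RGB_ARB, GL_SRC_COLOR );

        /* Unit 1 -- modulate the interpolation result with the lighting (primary color) */
        glActiveTexture( GL_TEXTURE1_ARB );
        glEnable( GL_TEXTURE_2D );
        glBindTexture( GL_TEXTURE_2D, soilTex );
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_MODULATE );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR );

        /* Unit 2 -- only holds the weight texture; pass the previous result through */
        glActiveTexture( GL_TEXTURE2_ARB );
        glEnable( GL_TEXTURE_2D );
        glBindTexture( GL_TEXTURE_2D, weightTex );
        glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_REPLACE );
        glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_PREVIOUS_ARB );
        glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR );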

Of course you can change the texture bindings and move the “pass-through” unit as you like. You cannot, however, change the order in which the modulation and interpolation happen.

This is exactly the problem with your current setup. You’re doing the modulation before the interpolation, thus only affecting one of the textures.

color = interpolate(modulate(tex0, lighting), tex2, tex1)
color = tex2*(tex0*lighting) + (1-tex2)*tex1

See the difference?

thanks thanks a lot!!! memfr0b :smiley:

Yes I see the difference between:

color = interpolate(modulate(tex0, lighting), tex2, tex1) ( what I was doing)

and:

color = modulate(interpolate(tex0, tex1, tex2), light)

I have tried doing the interpolation before the modulation, and you are right, it works!

But honestly, I don’t see why modulation before interpolation doesn’t work…
Because in my head I see:

1st: tex0*lighting (so the lighting is there in this unit)

2nd: (tex0*lighting)*tex2 + tex1*(1-tex2) as the color

It is as if the lightmap were already in the tex0 texture…I don’t see why the lighting would disappear at this point…

Originally posted by dletozeun:
2nd: (tex0*lighting)*tex2 + tex1*(1-tex2) as the color

It is as if the lightmap were already in the tex0 texture…I don’t see why the lighting would disappear at this point…
If tex2 has the value 1, the equation yields:

(tex0 * lighting ) * 1 + tex1 * 0 = tex0 * lighting

If tex2 has the value 0, the equation yields:

(tex0 * lighting ) * 0 + tex1 * 1 = tex1