
View Full Version : Multiply 2 tex /w ARB_multitexture



Frumpy
04-27-2001, 12:10 PM
After two weeks of trying, all I have to show for my attempts at multiplying two textures together with ARB_multitexture is a white screen with a faint outline of my image. #^$%!!! I originally multiplied the two together using multipass, like this:

// bind diffuse texture
glBindTexture( GL_TEXTURE_2D, _texName[0] );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE );
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
GL_BGRA_EXT, GL_UNSIGNED_BYTE, pDiffuseImage );

// bind bump texture
glBindTexture( GL_TEXTURE_2D, _texName[1] );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
glTexImage2D( GL_TEXTURE_2D, 0, GL_ALPHA8, width, height, 0,
GL_ALPHA, GL_UNSIGNED_BYTE, pBumpImage );

> ... <

// Pass 1: diffuse
glBindTexture( GL_TEXTURE_2D, _texName[0] );
glCallList( quad );

// Pass 2: bump map
glBlendFunc( GL_ZERO, GL_SRC_ALPHA ); // thank you j
glEnable( GL_BLEND );
glBindTexture( GL_TEXTURE_2D, _texName[1] );
glCallList( quad );
glDisable( GL_BLEND );
Perfect.

However, when I converted the second chunk of code to use multitexturing, like this:
glActiveTextureARB( GL_TEXTURE0_ARB );
glEnable( GL_TEXTURE_2D );
glEnable( GL_BLEND );
glBlendFunc( GL_ONE, GL_ZERO );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE );
glBindTexture( GL_TEXTURE_2D, _texName[0] );

glActiveTextureARB( GL_TEXTURE1_ARB );
glEnable( GL_TEXTURE_2D );
glEnable( GL_BLEND );
glBlendFunc( GL_ZERO, GL_SRC_ALPHA );
glBindTexture( GL_TEXTURE_2D, _texName[1] );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );

glBegin( GL_QUADS );
glColor3f( 1.0f, 1.0f, 1.0f ); // white
glNormal3f( 0.0f, 0.0f, 1.0f );
// four sets of these, in counter-clockwise order
glTexCoord2fv( k_TexCoord[n] );
glMultiTexCoord2fvARB( GL_TEXTURE1_ARB, k_TexCoord[n] );
glVertex2fv( k_VtxCoord[n] );
glEnd( );
// Disable blending & texturing for tex units 0 & 1.
Result: white screen and a faint image.

Lighting is off. Backface culling is off. Depth testing is off. Alpha testing is off. The ARB_multitexture extension is confirmed as supported on my GeForce MX under the Win2000 OS.

Please help. How can I do this without using any other extensions?

Korval
04-27-2001, 12:44 PM
OK, I see part of your problem.

You call glActiveTextureARB with GL_TEXTURE1_ARB. Later, you call glTexCoord. I'm sure you expected this call to tell OpenGL what texture coordinates you wanted for the first texture. Unfortunately, OpenGL still thinks you are referring to GL_TEXTURE1_ARB.

To make it unambiguous which texture unit's coordinates you are setting, I would suggest you always use glMultiTexCoord.
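For instance (just an untested sketch, reusing the names from your code), inside the glBegin/glEnd you would write:

```
// Address each unit's coordinates explicitly, so there is no
// question about which unit a coordinate call refers to.
glMultiTexCoord2fvARB( GL_TEXTURE0_ARB, k_TexCoord[n] );
glMultiTexCoord2fvARB( GL_TEXTURE1_ARB, k_TexCoord[n] );
glVertex2fv( k_VtxCoord[n] );
```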

The other part of your problem lies in multi-texture blending. Honestly, I don't know how to set it up properly, but I do know that it isn't set up with the standard glBlendFunc command. That command is for alpha-blending. There is probably either a command for it in the multitexturing extension, or it is part of the glTexEnv command. Or, look into the EXT_texture_env_combine extension.

DFrey
04-27-2001, 01:09 PM
To multiply two textures by way of multitexture, you only have to set the first unit to GL_REPLACE and the second unit to GL_MODULATE. No blending is necessary as blending uses the final output of the texture units; it does not work on each unit separately.
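Roughly like this (untested; the texture names here are placeholders, not your variables):

```
// Unit 0: output the diffuse texture unchanged.
glActiveTextureARB( GL_TEXTURE0_ARB );
glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, diffuseTex );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE );

// Unit 1: multiply unit 0's result by this texture.
glActiveTextureARB( GL_TEXTURE1_ARB );
glEnable( GL_TEXTURE_2D );
glBindTexture( GL_TEXTURE_2D, bumpTex );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );

// Note: no glEnable( GL_BLEND ) or glBlendFunc calls at all.
```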

[This message has been edited by DFrey (edited 04-27-2001).]

Frumpy
04-27-2001, 06:11 PM
Originally posted by Korval:
always use glMultiTexCoord

OK I've swapped all calls to glTexCoord with glMultiTexCoord for tex unit 0. Still a white screen and a faint diffuse image. <deep sigh>

The EXT_texture_env_combine extension seems to be the solution, as it was designed to do (texA * texB) + texC. Unfortunately, the extension is not supported by my video card. I believe the lack of support is because three texture units are required, and my GeForce MX has only two texture units.


[This message has been edited by Frumpy (edited 04-27-2001).]

Frumpy
04-27-2001, 06:24 PM
Originally posted by DFrey:
No blending is necessary as blending uses the final output of the texture units; it does not work on each unit separately.

I take it you meant I should remove the blending code. Without the blending code, the program displays only the diffuse bitmap, unbumped. (The multiply did not happen.)
If both textures were RGB, then setting the second texture unit to GL_MODULATE would cause a multiply. However, my bump map is of format GL_ALPHA, and OpenGL probably just didn't know what to do with it.

DFrey
04-27-2001, 06:44 PM
Ah yes, since one of your textures is an alpha texture (a detail I missed earlier), you would need to use the combine extension (or something with similar functionality). And your GeForce2 MX should very well support the combine extension (even my lowly TNT supports it). The three terms of the combine function do not imply the need for three texture units; each term is programmable. Multiplying one texture's RGB by another's alpha can be done with either the modulate or the interpolate function of the combine extension.
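From memory, the setup on the second unit would look something like this (untested; double-check the token names against the EXT_texture_env_combine spec):

```
glActiveTextureARB( GL_TEXTURE1_ARB );
glTexEnvi( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_EXT );
glTexEnvi( GL_TEXTURE_ENV, GL_COMBINE_RGB_EXT,  GL_MODULATE );
// Arg0: the RGB result of the previous unit (your diffuse texture).
glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE0_RGB_EXT,  GL_PREVIOUS_EXT );
glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND0_RGB_EXT, GL_SRC_COLOR );
// Arg1: this unit's texture, using its ALPHA channel as the factor.
glTexEnvi( GL_TEXTURE_ENV, GL_SOURCE1_RGB_EXT,  GL_TEXTURE );
glTexEnvi( GL_TEXTURE_ENV, GL_OPERAND1_RGB_EXT, GL_SRC_ALPHA );
```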

Frumpy
04-27-2001, 07:30 PM
Originally posted by DFrey:
Geforce2 MX should very well support the combine

You're right, my MX does support EXT_texture_env_combine. I've checked my two OSes (Win2000 & Win98), and according to glGetString( GL_EXTENSIONS ), Win2000 does not report combine but Win98 does. ??? Unfortunately, I'm developing on Win2000.

I'm using nVidia driver 6.50.

DFrey
04-28-2001, 02:59 AM
Originally posted by Frumpy:
win2000 does not support combine

I'm stuck working in Win98, so I wouldn't begin to know why not. You might want to bring this topic up on the advanced coding board so that Matt or Cass (who are NVIDIA driver developers) can address the issue.

Frumpy
04-28-2001, 10:32 AM
Sorry, my mistake: Win2000 does support EXT_texture_env_combine. When I checked for the extension, I only did an eyeball search of the string returned by glGetString( GL_EXTENSIONS ) in my debugger's watch window. Unfortunately, the watch window only displays the first 256 characters of a string, and EXT_texture_env_combine was much further down.
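A programmatic check would have avoided the whole problem. Here is roughly what I'm using now (hasExtension is my own helper, not part of OpenGL). Note that a plain strstr is not enough, since one extension name can be a prefix of another:

```c
#include <string.h>

/* Looks for an exact extension token in the space-separated string
   returned by glGetString( GL_EXTENSIONS ). */
int hasExtension( const char *extList, const char *name )
{
    size_t len = strlen( name );
    const char *p = extList;

    while ( ( p = strstr( p, name ) ) != NULL )
    {
        /* The token must start at the beginning or after a space... */
        int startsOk = ( p == extList ) || ( p[-1] == ' ' );
        /* ...and end at a space or at the end of the string. */
        int endsOk = ( p[len] == ' ' ) || ( p[len] == '\0' );

        if ( startsOk && endsOk )
            return 1;

        p += len;
    }
    return 0;
}
```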