Textures don't work on some graphics cards

It’s me once again, with a question resulting from my attempts to get texture-mapped fonts to work on various platforms and configurations.

In spite of all my efforts, at least on some notebooks (like DELL PRECISION) with mobile nVIDIA graphics cards under Windows XP and RedHat Linux 8, I get rectangles filled with the current color instead of character glyphs, or rectangles filled with “current-colored” and white horizontal stripes (assuming that the background color is white).

One more example of a problematic configuration: graphics card S3 SAVAGE/IX (driver version 5.12.1.7062 of 20/03/2001) under MS Windows 2000 Pro.

I failed to resolve the problem by updating the graphics driver: in these cases, it is almost impossible to find an updated driver version.

I can make the text visible (for testing purposes) by specifying “internal_format” and “format” arguments for the glTexImage2D() function different from GL_INTENSITY and GL_LUMINANCE_ALPHA - for instance, GL_RGBA. However, I cannot change the text color in this case.
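
For illustration, that test variant is roughly the following (just a sketch: aRGBAPixels is a hypothetical four-bytes-per-pixel buffer built from the same font image, not my actual code):

// Sketch only - expand the two-byte luminance/alpha image to RGBA,
// keeping the glyph coverage in the alpha channel (hypothetical names).
unsigned char* aRGBAPixels = new unsigned char[myTexFontWidth * myTexFontHeight * 4];
for ( int p = 0; p < myTexFontWidth * myTexFontHeight; p++ )
{
    unsigned char aCoverage = pixels[p * 2 + 1]; // alpha byte of the L/A pair
    aRGBAPixels[p * 4 + 0] = 0xff;
    aRGBAPixels[p * 4 + 1] = 0xff;
    aRGBAPixels[p * 4 + 2] = 0xff;
    aRGBAPixels[p * 4 + 3] = aCoverage;
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             myTexFontWidth, myTexFontHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, aRGBAPixels);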

Probably, there is an alternative and more reliable combination of texture parameters that would allow to draw texture-mapped fonts (with transparent empty pixels between glyphs and a possibility to change font color without re-building the texture) - I would be extremely thankful for any suggestions.

Some code excerpts (nothing special, in fact - it works well on most Win2k/WinXP/RedHat 8 stations):

// 1. Computing necessary geometrical font parameters,
// preparing a font image in “pixels” variable
// For each pixel, two bytes - luminance and alpha - are stored:
// both zero for empty pixels and 0xff for image pixels
char* pixels = new char[myTexFontWidth * myTexFontHeight * 2];
// …

// 2. Creating a texture with character glyphs
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glGenTextures(1, &myTexFont);
glBindTexture(GL_TEXTURE_2D, myTexFont);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, myMinMagFilter);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, myMinMagFilter);
glTexImage2D(GL_TEXTURE_2D, 0, GL_INTENSITY,
             myTexFontWidth, myTexFontHeight, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, pixels);

// …

// 3. Using the texture to display text from
// “theStr” variable
double aXScale = 1., aYScale = 1.;
// store attributes
glPushAttrib( GL_ENABLE_BIT | GL_TEXTURE_BIT );

if ( !myIsResizeable )
{
    GLdouble modelMatrix[16]; // declaration assumed - not shown in the excerpt
    glGetDoublev( GL_MODELVIEW_MATRIX, modelMatrix );
    aXScale = modelMatrix[0];
    aYScale = modelMatrix[5];
}

glEnable(GL_TEXTURE_2D);

glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, myMinMagFilter);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, myMinMagFilter);

glPixelTransferi(GL_MAP_COLOR, 0);

glAlphaFunc(GL_GEQUAL, 0.05F);
glEnable(GL_ALPHA_TEST);

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA,GL_ONE_MINUS_SRC_ALPHA);

glBindTexture(GL_TEXTURE_2D, myTexFont);
glBegin(GL_QUADS);

if ( myIsResizeable && theScale > 0. )
{
    aXScale = aXScale / theScale;
    aYScale = aYScale / theScale;
}

// aFontHeight contains the font height in pixels
// coming from font metrics
theY = theY - ( myTexFontHeight - aFontHeight ) / aYScale;

double aLettBegin, aLettEnd, aDY = ( myTexFontHeight - 1 ) / aYScale, aDX;
char aLetter;
int aLettIndex;

for ( int i = 0, l = strlen( theStr ); l > i; i++ )
{
    aLetter = theStr[i];
    aLettIndex = (int)aLetter - FirstSymbolNumber;

    aLettBegin = (double)myPositions[aLettIndex] / ( (double)myTexFontWidth - 1. );
    aLettEnd = aLettBegin + ( (double)myWidths[aLettIndex] - 1. ) / ( (double)myTexFontWidth - 1. );

    aDX = ( (double)myWidths[aLettIndex] - 1. ) / aXScale;

    glTexCoord2d( aLettBegin, 0.0 );
    glVertex3d( theX, theY, 1.0 );
    glTexCoord2d( aLettBegin, 1.0 );
    glVertex3d( theX, theY + aDY, 1.0 );
    glTexCoord2d( aLettEnd, 1.0 );
    glVertex3d( theX + aDX, theY + aDY, 1.0 );
    glTexCoord2d( aLettEnd, 0.0 );
    glVertex3d( theX + aDX, theY, 1.0 );

    theX += aDX + mySeparator / aXScale;
}

glEnd();
// restore attributes
glPopAttrib();

Best regards,
Sergey

I don’t know why some implementations don’t work for you, but there are some things to be corrected in the code:

  • Check that your textures are power-of-two sized, or older hardware won’t work.
  • I assume minification filtering never uses mipmaps, or this won’t work.
  • You should not use GL_CLAMP as the wrap mode; some hardware doesn’t support it. GL_CLAMP_TO_EDGE won’t filter in the texture border color.
  • TexParameters are per texture object. You set them once when downloading the texture, and glBindTexture recalls them. There is no need to set them in step 3 at all (especially before the glBindTexture) - see the sketch after this list.
  • In the given code the texture’s internalFormat is GL_INTENSITY, but you load from GL_LUMINANCE_ALPHA user data, so the texture’s alpha channel is not retained as such. Modulate multiplies A = Af * It, so that would still work.
  • LUMINANCE_ALPHA with GL_MODULATE multiplies the current alpha with the texture’s alpha. Make sure the current color has alpha != 0.0.
  • It looks strange that all your vertices are at z = 1.0. Normally font rendering is 2D (at z = 0.0). Check your projection.

Performance:

  • Don’t use doubles in the OpenGL calls; use the float versions.
  • Move the second myIsResizeable block into an else clause of the first.
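
A minimal sketch of the wrap-mode and TexParameter points, assuming the rest of step 2 stays as posted (GL_CLAMP_TO_EDGE is core since OpenGL 1.2; older drivers may expose it via the edge-clamp extensions under the same token value):

// Sketch: set the wrap/filter state once, on the bound texture object.
glGenTextures(1, &myTexFont);
glBindTexture(GL_TEXTURE_2D, myTexFont);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// non-mipmapped filters only, since just the base level is uploaded
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, myMinMagFilter);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, myMinMagFilter);
// ... glTexImage2D(...) as in step 2 ...

// In step 3 a plain bind is then enough - the parameters are recalled with it:
glBindTexture(GL_TEXTURE_2D, myTexFont);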

Originally posted by Relic:
I don’t know why some implementations don’t work for you, but there are some things to be corrected in the code.
Check that your textures are power-of-two sized, or older hardware won’t work.

Sure, they are.

Originally posted by Relic:
I assume minification filtering never uses mipmaps, or this won’t work.

No, mipmaps are not used for minification.

Originally posted by Relic:
You should not use GL_CLAMP as the wrap mode; some hardware doesn’t support it. GL_CLAMP_TO_EDGE won’t filter in the texture border color.

This is interesting - I will check, thanks!

Originally posted by Relic:
TexParameters are per texture object. You set them once when downloading the texture, and glBindTexture recalls them. There is no need to set them in step 3 at all (especially before the glBindTexture).

Oops! I didn’t know that before.

Originally posted by Relic:
LUMINANCE_ALPHA with GL_MODULATE multiplies the current alpha with the texture’s alpha. Make sure the current color has alpha != 0.0.

Sure, it has non-zero alpha :)

Originally posted by Relic:
It looks strange that all your vertices are at z = 1.0. Normally font rendering is 2D (at z = 0.0). Check your projection.

Yes, it’s strange - a sort of tradition rather than something logical… I agree that z = 0.0 would look more straightforward.

Originally posted by Relic:
Performance: Don’t use doubles in the OpenGL calls; use the float versions.

Thanks, this is good advice to follow.

BTW, does anybody know if the S3 SAVAGE IX includes any hardware acceleration of texturing? The problem is that performance is far from good on this card… and I use textures intensively in my application: there are several dozen 64x64 textures loaded and displayed simultaneously, apart from a few larger textures used for texture-mapped fonts. There are no performance problems on, e.g., an nVIDIA GeForce FX5600.

Anyhow, I’m still confused by the fact that replacing the “canonical” format used for texture-mapped fonts (GL_INTENSITY + GL_LUMINANCE_ALPHA) with GL_RGBA makes the characters visible… probably it’s possible to achieve the same OpenGL output using GL_RGBA plus some combination of other texture parameters…

The other reason for the problems I can imagine is that the conversion between the user input data and the internalFormat might not be done correctly inside some implementations.
Check whether the problem is also visible when the input data and the internalFormat match in the glTexImage call.
Try the more modern internalFormat enums which also specify the internal precision - GL_INTENSITY8 and GL_LUMINANCE8_ALPHA8 in your case.
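
For example, a matched upload could look like this (just a sketch, keeping your existing two-byte pixel buffer):

// Sketch: user data and internal format both luminance+alpha,
// with explicit 8-bit internal precision.
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE8_ALPHA8,
             myTexFontWidth, myTexFontHeight, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, pixels);
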
Running out of ideas now.
Do you have a reproducer to download somewhere?

Originally posted by Relic:

Do you have a reproducer to download somewhere?

No - it’s a part of big application, and I’m not entitled to distribute it, sorry!
If I have some time to build a small executable for testing, I’ll definitely inform you.
Anyhow, thanks for sharing your ideas!

You might want to try some glGetError() calls too. This might tell you what is going wrong.
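
For instance (a minimal sketch; requires <stdio.h>):

// Drain the error queue right after the suspect call - glGetError()
// returns GL_NO_ERROR once the queue is empty.
GLenum anErr;
while ((anErr = glGetError()) != GL_NO_ERROR)
    printf("GL error after glTexImage2D: 0x%04x\n", anErr);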

There is an easy check for that:

Run the DirectX Caps Viewer (C:\Program Files\Microsoft DirectX SDK (December 2005)\Utilities\Bin\x86, if you have the DXSDK installed).

Look at

DirectX Graphics Adapters/<your adapter>/D3D Device Types/HAL/Adapter Formats/D3DFMT_X8R8G8B8/Texture Formats/

Look for D3DFMT_A8L8. If you see ‘No’, then this format is not supported, and probably the result is undefined in OpenGL.

Originally posted by execom_rt:
There is an easy check for that:

Run the DirectX Caps Viewer (C:\Program Files\Microsoft DirectX SDK (December 2005)\Utilities\Bin\x86, if you have the DXSDK installed).

Is there anything similar that runs on Win2k SP3? Do you know whether the DX9.0 SDK (an older SDK version that supports Win2k) includes this diagnostic tool?

Originally posted by execom_rt:
Look for D3DFMT_A8L8. If you see ‘No’, then this format is not supported, and probably the result is undefined in OpenGL.

No - functionality like GL_LUMINANCE_ALPHA has been in the OpenGL core since the first version, and core functionality is guaranteed to work.
Agreed, if the hardware does not support it natively, which can be seen from the DX caps, then the OpenGL implementation on that chip has to emulate it. I wouldn’t put my wagers on S3 in that case. ;)

Here:
DirectX 9.0c SDK Summer 2004

This one works with Windows 2000.

Or this link

It does the same thing (and may be a smaller download).

Bingo!!! The problem resulted from the font texture width (2048) exceeding the maximum supported texture dimensions (1024x1024) of the S3 card… most likely it’s the same on the other problematic configurations.

I’m going to use glGet(GL_MAX_TEXTURE_SIZE) and adjust the font texture dimensions (by splitting the atlas into several character rows) to avoid textures that are too wide or too high.
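
Something along these lines (a sketch - the atlas/row names are hypothetical, not from the application):

// Sketch: query the limit, then halve the atlas width (doubling the
// number of glyph rows) until it fits.
GLint aMaxSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &aMaxSize); // e.g. 1024 on the S3 Savage/IX
int anAtlasWidth = myTexFontWidth;             // e.g. 2048
int aNbRows = 1;
while (anAtlasWidth > aMaxSize)
{
    anAtlasWidth /= 2;
    aNbRows *= 2;
}
// ... then lay the glyphs out in aNbRows rows of width anAtlasWidth,
// extending the per-character positions with a row index ...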

I’m sorry for misleading you - replacing GL_INTENSITY + GL_LUMINANCE_ALPHA with GL_RGBA helped solve a different bug… and obviously the texture size was smaller in that case…