Problem changing pixmap font colour

I’ve got a strange problem. Here’s what I’m doing. I have a set of anti-aliased characters (white text on black background) stored as individual glyphs in display lists.
This works very well for drawing them, but I want to change the colour arbitrarily when I draw them so I can draw text in red, then grey, then blue and so on without having to rebuild the images.

I do this by using
glPixelTransferf(GL_RED_SCALE, red);
glPixelTransferf(GL_GREEN_SCALE, green);
glPixelTransferf(GL_BLUE_SCALE, blue);

which works very well.

(I saw in this thread ( http://www.opengl.org/discussion_boards/ubb/Forum3/HTML/005739.html ) a way of doing it using GL_*_BIAS which I might try as well, although it may not be suitable for anti-aliased text. )

The problem is that on a colleague’s machine (both running Windows) the red and blue components are transposed. So red text is blue and vice versa, but green text is fine.

Any idea what may cause this? I suspect a bug in his OpenGL implementation, and have advised him to update his device drivers, but could it be anything else?

Are you setting the Pixel transfer modes and then using glTexImage2D? Or are you using glDrawPixels()?

If you are using glTexImage2D() then try using glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE) and just vary the colour of your quads.

If you are using glDrawPixels - I would guess that you should switch to using Textured Quads (I don’t use glDrawPixels but I think it would be quite a bit slower than using Quads).

[EDIT] - Ah I see you mentioned display lists so I guess you are applying your Pixel Transfer Bias to glTexImage2D…

[This message has been edited by rgpc (edited 05-02-2003).]

No, I’m using drawpixels.
Each character is rendered into a small image with an alpha mask to maintain the anti-aliasing of the original character and then compiled into a display list.

Why would textured quads be faster than drawpixels? Is it because the texture is stored on the graphics card rather than system memory? I am potentially going to be using a lot of fonts of different sizes which all need to be available very quickly (so I don’t want to risk blowing the texture memory), and I need to ensure that they are pixel-perfect representations of the originals.

(It’s also intended for use on any OpenGL system, not a specific implementation.)

But my main concern is why the red and blue components are transposed on one system and not another.

Originally posted by Adrian67:
Why would textured quads be faster than drawpixels? Is it because the texture is stored on the graphics card rather than system memory?

Exactly. If you use glDrawPixels() to draw a single character of 16x16 pixels using RGBA32 you are sending 16x16x4 bytes across the AGP bus. If you use a quad you are sending just four vertices of two coordinates each. Granted, there is a small overhead in switching to and setting up glOrtho(), but that'd be negligible.

I am potentially going to be using a lot of fonts of different sizes which all need to be available very quickly (so I don’t want to risk blowing the texture memory), and I need to ensure that they are pixel-perfect representations of the originals.

I can’t think of an obvious use for this other than to send someone into convulsions. Pixel-perfect is fine when using quads that match the size of your characters, and various font sizes can ensure readability at different resolutions. But having all your fonts, in all their different sizes, loaded at once doesn’t have an obvious use to me. You need to look at whether you really need them all loaded all the time (and whether they would risk the textures being swapped out).


(It’s also intended for use on any OpenGL system, not a specific implementation.)

Quads are pretty standard on most systems - they even have four sides in DirectX (although Micro$oft will probably increase that to 5 with Dx 10 - just to be one better)


But my main concern is why the red and blue components are transposed on one system and not another.

I would guess it has something to do with the way you pass your data to glDrawPixels(). It might be that you are using GL_BGRA and the problem card doesn’t support BGRA - and it handles it incorrectly. (In the beginning there was BGRA, then ARB said “let there be RGBA_EXT”, and the ARB saw that it was good…)

I would guess it has something to do with the way you pass your data to glDrawPixels(). It might be that you are using GL_BGRA and the problem card doesn’t support BGRA - and it handles it incorrectly.

I am using GL_BGRA, but I also draw normal images in the same code using drawpixels and GL_BGRA. It’s only when I use glPixelTransferf that the colour components are swapped (fine on my Windows XP machine, wrong on my colleague’s Win2000 machine).