Texture filters on ATI

I have a problem rendering a piece of transparent texture onto a scene. The texture itself is a smoothly shaded circle that changes from yellow at the center to white toward the edges.

I tried the program on an ATI Radeon 9700 Pro with the Catalyst 3.4 driver. The color depth is 32 bits. The texture is rendered correctly, except that it looks as if it were rendered at 16 bits: the color changes in 'steps' instead of smoothly. At this point the ATI control panel -> OpenGL settings are on 'Balanced'. After changing the setting to 'High Quality', the texture is rendered as I expected.

I also tested the program on GeForce cards (4 MX and Ti 4200), and it works as expected without changing any driver settings.

I'm just wondering whether this is a driver problem or my problem. Is there anything I can set in my program to force the texture quality to high every time my program runs? Or should I write my own texture filter? I know I can always ask users to set up the driver themselves, but I would like it done automatically.

Forgot to mention - this is the code I use to load the texture.
I tried both glTexImage2D and gluBuild2DMipmaps, but neither seems to make much difference.

//gluBuild2DMipmaps(GL_TEXTURE_2D, 4, width, height, GL_RGBA, GL_UNSIGNED_BYTE, textureData);
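// Upload the RGBA texture data; the internal format argument is passed as 4 (a component count)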
glTexImage2D(GL_TEXTURE_2D, 0, 4, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);
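// Bilinear filtering for both magnification and minification (no mipmaps are used)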
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

What driver problem? You just said you had to adjust the quality in the panel.

AFAIK, the control panel is there so that the user has absolute control over what the card does.

For example, I set the texture anisotropy to max and I don't want a program to override what I want.

Originally posted by dominickwan:
glTexImage2D(GL_TEXTURE_2D, 0, 4, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);
Substitute GL_RGBA8 for that 4.

It's about time someone removed all the old OpenGL documentation that still describes that argument as "components" instead of what it really is: "internalFormat".
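For reference, the upload from the original post would then look something like this (just a sketch, keeping the same variable names and filters; a sized format explicitly asks the driver for 8 bits per channel, so the 'Balanced' texture quality setting shouldn't be able to quietly drop it to a 16-bit format):

// Request an explicit 8-bit-per-channel internal format instead of "4"
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, textureData);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);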

Agreed, this one comes up again every month or so :)

Y.