texture filtering problem

Hello, I’ve been trying to set up OpenGL texture filtering with GL_LINEAR for the MIN filter and GL_NEAREST for the MAG filter:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

But the result looks as if both filters were set to GL_NEAREST.
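
For what it’s worth, one way to double-check is to read the parameters back from the driver rather than judging only from the rendered image. This is just a fragment and assumes the texture is still bound to GL_TEXTURE_2D and that <stdio.h> is included:

GLint minFilter = 0, magFilter = 0;
glGetTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, &minFilter);
glGetTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, &magFilter);
/* Report what the driver actually stored for each filter. */
printf("MIN: %s\n", minFilter == GL_LINEAR ? "GL_LINEAR" :
                    minFilter == GL_NEAREST ? "GL_NEAREST" : "other");
printf("MAG: %s\n", magFilter == GL_LINEAR ? "GL_LINEAR" :
                    magFilter == GL_NEAREST ? "GL_NEAREST" : "other");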

On the other hand, if I use GL_NEAREST for MIN and GL_LINEAR for MAG, or GL_LINEAR for both, they work as expected.

Is this a hardware issue? Does anyone have an idea of what’s wrong?

Thanks.

It’s probably just a driver bug, but it could be hardware related. Either way, it’s a bug in the OpenGL implementation.

Is this a well-known bug? Has anyone else experienced this? I’m using an FX 5200.

I just ran some tests to be sure. It seems that my GeForce3 Ti 200 works as expected.

How on earth can you tell the difference between NEAREST and LINEAR for minification? It is really subtle. And why would you actually need that?

Maybe if you change the driver’s ‘image settings’ option to ‘quality’, the card will follow your settings more precisely.

You will see much more aliasing in the range just after minification begins; in that area it makes a big difference. Under some circumstances it is obvious and easy to reproduce, for example when drawing sprites at less than 1:1.
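
As a rough sketch of what I mean (assuming a current GL context, an orthographic projection in window coordinates, and a 256x256 texture already bound to GL_TEXTURE_2D), drawing the same texture at 100x100 pixels exercises the MIN filter, and doing it side by side with each filter makes the difference easy to see:

glEnable(GL_TEXTURE_2D);

/* Left quad: nearest minification; expect shimmering and aliasing. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f( 10.0f,  10.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(110.0f,  10.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(110.0f, 110.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f( 10.0f, 110.0f);
glEnd();

/* Right quad: linear minification; noticeably smoother. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(130.0f,  10.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(230.0f,  10.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(230.0f, 110.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(130.0f, 110.0f);
glEnd();

At that size every fragment covers more than one texel, so only GL_TEXTURE_MIN_FILTER matters for these quads.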

That is right, thank you dorbie.

I just tried it out on someone else’s computer, also using an FX card, and it didn’t work. I’m probably just gonna stick with GL_LINEAR for both MIN and MAG. Thanks for the help, guys.

Are you sure there is no bug in your program? I had that once… I was calling these functions not after binding the texture, but somewhere in the startup code, and of course that didn’t work. There is definitely no bug in FX hardware or drivers regarding these functions (at least as far as I can tell).
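
In other words, a minimal sketch of the order I mean (width, height and pixels here are placeholders for your own image data, and a valid GL context is assumed):

GLuint textureID;
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);  /* bind the texture object first */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
/* Texture parameters are per-texture state, so set them while the texture is bound. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);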