I just ran some tests to be sure. It seems that my GeForce3 Ti 200 works as expected.
How on earth can you tell the difference between NEAREST and LINEAR for minification? It is really subtle. And why would you actually need that?
Maybe if you change the ‘image settings’ slider to ‘quality’, the card will follow your settings more precisely.
You will see much more aliasing in the range just after minification begins; in that area it makes a big difference. Under some circumstances it is obvious and easy to reproduce, for example when drawing sprites at less than 1:1.
I just tried it out on someone else’s computer, also using an FX card, and it didn’t work. I’m probably just gonna stick with GL_LINEAR for both MIN and MAG. Thanks for the help, guys.
Are you sure there is no bug in your program? I had that once: I was calling these functions not after binding a texture, but somewhere in the startup code, and of course that didn’t work. There is definitely no bug in FX hardware or drivers regarding these functions (at least as far as I can tell).
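For reference, glTexParameteri only affects the texture object bound at the time of the call, so the call order matters. A minimal sketch of the correct ordering (it needs a live GL context to run; `width`, `height` and `pixels` are placeholder names for your own image data):

```c
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);  /* bind FIRST... */

/* ...then set the filters: they are stored per texture object,
   not globally, so calls made before any bind (or while another
   texture is bound) will not affect this texture */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* upload the image (with mipmaps, since the MIN filter uses them) */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, width, height,
                  GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```

If the filter calls were in your startup code before any glBindTexture, they went to the default texture object instead of yours, which would look exactly like the driver ignoring them.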