Odd results when using anisotropic filtering

Hi - I’ve just started learning OpenGL and have got round to texturing, but have encountered an odd problem with anisotropic filtering.

I’m using a very basic mipmap using plain colours (level 0 is white, level 1 red and so on) to help me see what’s going on. All examples are just a rectangle rotated into the scene.
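For reference, a colour-per-level test mipmap like that can be built with a small helper along these lines (a sketch; the `fill_solid_rgb` name and colour choices are my own, not from the original post):

```c
#include <assert.h>

/* Fill a w*h RGB8 buffer with one solid colour. Uploading each mipmap
   level with a different colour (level 0 white, level 1 red, ...) makes
   the level transitions easy to see on screen. */
static void fill_solid_rgb(unsigned char *buf, int w, int h,
                           unsigned char r, unsigned char g, unsigned char b)
{
    for (int i = 0; i < w * h; ++i) {
        buf[3 * i + 0] = r;
        buf[3 * i + 1] = g;
        buf[3 * i + 2] = b;
    }
}

/* Usage with OpenGL (not compiled here):
   fill_solid_rgb(buf, 256, 256, 255, 255, 255);
   glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 256, 256, 0,
                GL_RGB, GL_UNSIGNED_BYTE, buf);
   ...and so on for levels 1, 2, ... down to 1x1. */
```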

No filtering (fine):


glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);

Filtering, isotropic (again, all good):


glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);

And with 4x anisotropic filtering (something wrong here):


glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameterf (GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, 4.0f);

As far as I can tell that last line is all that’s needed besides checking for the extension. Am I missing something?
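On the extension-check point: the name has to match a whole, space-delimited token in the string returned by `glGetString(GL_EXTENSIONS)`; a plain `strstr` can false-positive when one extension name is a prefix of another. A sketch of a safe check (the `has_extension` helper is my own):

```c
#include <string.h>
#include <assert.h>

/* Return 1 if `name` appears as a complete, space-delimited token in
   the extension string `ext_list`, 0 otherwise. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;
    while ((p = strstr(p, name)) != NULL) {
        /* The match must start at the beginning or after a space... */
        int starts_ok = (p == ext_list) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string. */
        int ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}

/* Usage (not compiled here):
   const char *exts = (const char *) glGetString(GL_EXTENSIONS);
   if (has_extension(exts, "GL_EXT_texture_filter_anisotropic"))
       ... enable AF ... */
```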

Environment:
Linux
Mesa DRI R200
ATI Radeon 9200

Thanks in advance for any help!

That is correct, but there are some things you should know about AF:

Most cards don’t combine AF with a trilinear filter. That is the reason for the mipmap banding.

The calculated mipmap level is also not the same as for isotropic filtering: with 4x AF, each sampled texel covers only 1/4 of the area of an isotropically filtered texel (when the samples are axis-aligned). For more information, search for “Footprint Assembly”.

Test instead with a mipmapped black-and-white checkerboard. Your current artificial mipmap levels are designed to test trilinear filtering, not aniso.
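A checkerboard level is easy to generate procedurally; something along these lines (a sketch, the helper name is mine):

```c
#include <assert.h>

/* Fill a w*h RGB8 buffer with a black/white checkerboard whose squares
   are `cell` pixels on a side. Upload one such buffer per mipmap level
   (halving w, h, and typically `cell` each level). */
static void fill_checker_rgb(unsigned char *buf, int w, int h, int cell)
{
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            /* Alternate colour every `cell` pixels in x and in y. */
            int on = ((x / cell) + (y / cell)) & 1;
            unsigned char v = on ? 255 : 0;
            unsigned char *px = buf + 3 * (y * w + x);
            px[0] = px[1] = px[2] = v;
        }
    }
}
```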

Erm, that is untrue. Aniso should work absolutely fine with trilinear filtering.

Your use-case is indeed better suited to testing trilinear filtering, but as far as I can see you are not doing anything wrong, so it should work correctly.

Mesa DRI is a software renderer, no? Can you try it with hardware acceleration, or on some other PC? Maybe it is indeed a bug.

Jan.

Unfortunately that will only happen with the highest image-quality settings (in the driver panel). Especially older cards “optimize” the filtering by using bilinear instead of trilinear filtering.

The Radeon 9200 doesn’t support trilinear aniso.

Thanks for all the responses.

Mesa is the implementation of OpenGL on Linux; DRI indicates the hardware-accelerated path.

That would explain it! So it’s dropping to bilinear when AF is turned on. I’ll try again with a checkerboard pattern and stick to MIPMAP_NEAREST, then.

Not much I can do about this, I suppose, besides upgrading my ancient card! Although the Mesa software renderer might support trilinear + AF, so I’ll see what happens when I turn DRI off.

You can leave MIPMAP_LINEAR enabled. Your card will do the fallback, but when running on other hardware, it will look better (as intended).
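On the portability point: the supported maximum degree varies per card, so it is common to query GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT and clamp the requested value to it. A minimal sketch (the clamp helper is my own):

```c
#include <assert.h>

/* Clamp a requested anisotropy degree to the device limit.
   1.0 means isotropic filtering, i.e. AF is effectively off. */
static float clamp_aniso(float requested, float supported_max)
{
    if (requested < 1.0f)
        return 1.0f;
    return requested < supported_max ? requested : supported_max;
}

/* Usage (not compiled here):
   GLfloat max_aniso;
   glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
   glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                   clamp_aniso(4.0f, max_aniso)); */
```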

Jan.

Ah, this confirms what’s going on.

Trilinear, no AF

(not sure what’s going on in the distance - some other issue)

4x AF enabled with MIPMAP_LINEAR:

AF is clearly working, and is just using bilinear filtering.

I’m starting to understand this stuff a bit more :slight_smile:

Thanks again everyone.

As a final piece of advice, your test texture will be even more useful with more squares, like this:

http://www.hwupgrade.it/articoli/skvideo/1361/immagini/6800/anisotropico/testfilter/ogl/ononon.jpg

While we’re discussing AF: even on a GeForce 8 we see lots of temporal “sparkling” at medium-to-high levels of aniso (e.g. 4x and up), particularly when there are high frequencies in the data (light/dark, similar to the checkerboard test). This happens when, say, the eyepoint is 5-30 deg up from the plane of the aniso-filtered polygon, looking down at ~20-30 deg.

Yes, just like the mags, it looks fine with a static scene, but when you’re moving toward or away from it, it’s really nasty. Moving laterally, it’s fine. Also, rolling the eyepoint 45 deg (i.e. bank the eyepoint) before moving forward/back greatly reduces the artifacts, but does not eliminate them. Card is an 8800 GTX.

First of all, what is this? Is this “brilinear” being applied to AF? Is this just unavoidable inaccuracies in the AF “integration” of all the texel data?

Also, what are folks doing about it. Blurring your textures? Special filters applied when generating MIPmaps that reduce the problem? Some driver tweak to disable bilinear/brilinear cheats?

Any pointers to worthwhile write-ups on this would be appreciated.

“Brilinear” is not a technical term. It was coined by the website that you have linked to.

What they mean by the term is that drivers cheat. You choose trilinear filtering, but the driver reduces quality to speed things up. Instead of always looking up two mipmap levels and always blending them accurately, two levels are blended only in a very small band; everywhere else only one level is accessed. This makes texture access and filtering much less costly, but reduces the quality of all textures that were supposed to use proper trilinear filtering.

Today most drivers have a “high-quality” mode that you can enable in the settings dialogs. “High-quality” usually means that such unfair tricks are definitely not used. As long as you don’t check that box, drivers do all sorts of things that might reduce quality for the sake of higher performance.

Jan.

Perhaps not, but my intent was to convey the concept without having to write a paragraph explaining what it was. The term+URL does that. The term has become fairly ubiquitous over the last 4 years.

Today most drivers have a “high-quality” mode that you can enable in the settings dialogs. “High-quality” usually means that such unfair tricks are definitely not used.

Yeah, I’ve read that. I’d like to try it, but…

…what I don’t know is if nVidia or ATI provide this texture filtering “quality toggle” in their Linux drivers, and where it is. Does anyone know?

The Nvidia proprietary drivers for Linux have it, even if it is less detailed than under Windows, where you can tweak each perf-cheat separately (trilinear, aniso, and so on).

Just for the archives, the solution under Linux (latest drivers) is to run nvidia-settings, go to “X Screen 0 -> OpenGL Settings”, and move “Image Settings” from “Quality” (default) to “High Quality”. The aniso texture filter sparkling isn’t totally cured, but it’s greatly reduced!

nvidia-settings also supports batch-loading its saved config file, so the setting can be applied on X server startup fairly simply via .xsession or .xinitrc.