I created a demo for distance field contour texturing (Green’s method from SIGGRAPH 2007), but ran into a strange bug. ATI hardware behaves as expected, but on Nvidia hardware the dFdx() and dFdy() functions, and the related fwidth(), behave erratically.
(Windows, Mac and Linux versions are available.)
The strange and nasty artefacts appear when the textured plane is rotated out of the screen plane, using ctrl + vertical mouse drag in the demo.
Any ideas what causes this? I would love to present a demo that works on both platforms, and I see no reason why it shouldn’t work. What am I doing wrong? The problematic shader is in the file “fragment1.glsl” in the demo archive.
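For reference, the core of the technique looks something like the sketch below. This is not the actual contents of fragment1.glsl, just a minimal illustration of Green's distance-field edge antialiasing; the sampler and varying names (distmap, uv) are assumed:

```glsl
// Minimal sketch of distance-field contour antialiasing (Green, SIGGRAPH 2007).
// 'distmap' and 'uv' are illustrative names, not taken from the demo.
uniform sampler2D distmap;
varying vec2 uv;

void main() {
    // Signed distance, stored biased around 0.5 in the texture
    float dist = texture2D(distmap, uv).a - 0.5;
    // fwidth(d) == abs(dFdx(d)) + abs(dFdy(d)): screen-space rate of change,
    // which is where the Nvidia/ATI derivative discrepancy would show up
    float width = fwidth(dist);
    // Smooth transition over roughly one pixel around the contour
    float alpha = smoothstep(-width, width, dist);
    gl_FragColor = vec4(vec3(alpha), 1.0);
}
```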
I experienced differences in dFdy() on ATI vs Nvidia. The latter did it right according to the spec, while ATI did the reverse. I’m not using these derivatives any more…
I toyed a bit with the GLSL code but couldn’t find a way to fix it; the sign of dFdy naturally doesn’t matter here.
I suggest you set up a fixed test case (a slight fixed tilt, with image 4 loaded) and debug via RGBA32F render targets.
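One way to do that inspection, sketched here with assumed names (distmap, uv): write the raw derivative values into the float render target so they can be read back and examined numerically, rather than judging them by eye:

```glsl
// Debug shader sketch: pack dFdx, dFdy and fwidth of the distance value
// into the color channels of an RGBA32F target for numeric readback
// (e.g. via glReadPixels). 'distmap' and 'uv' are illustrative names.
uniform sampler2D distmap;
varying vec2 uv;

void main() {
    float dist = texture2D(distmap, uv).a;
    gl_FragColor = vec4(dFdx(dist), dFdy(dist), fwidth(dist), 1.0);
}
```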
Thanks for the suggestions. A pity that the auto derivatives are wrong on ATI, but as I said, this problem is with Nvidia hardware, and as pointed out above, the sign doesn’t matter here. I guess I’ll have to look closer into what actually happens. My main problem for debugging is that I don’t have easy access to Nvidia hardware.
Anisotropic analytic antialiasing is all about auto derivatives. I see no alternative to using them. Oh, well.
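To illustrate why the automatic derivatives are unavoidable here: the anisotropic variant needs the full screen-space gradient of the distance field, not just a scalar width. A hedged sketch, again with assumed names (distmap, uv):

```glsl
// Sketch of anisotropic analytic antialiasing: the filter width follows
// the actual screen-space gradient of the distance field, so both dFdx()
// and dFdy() are required. 'distmap' and 'uv' are illustrative names.
uniform sampler2D distmap;
varying vec2 uv;

void main() {
    float dist = texture2D(distmap, uv).a - 0.5;
    // Full screen-space gradient; its length is the rate of change in the
    // direction of maximum variation, unlike the axis-summed fwidth()
    vec2 grad = vec2(dFdx(dist), dFdy(dist));
    float width = length(grad);
    float alpha = smoothstep(-width, width, dist);
    gl_FragColor = vec4(vec3(alpha), 1.0);
}
```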