Getting specific LoD or bias with texture2DGrad()

Hello,

I have to sample a texture inside a loop, which of course gives artifacts because the implicit derivatives used for mipmap/LoD selection are undefined inside non-uniform control flow, as detailed here: http://www.opengl.org/pipeline/article/vol001_5/

The workaround is to calculate the gradients before the loop and use texture2DGrad() inside the loop instead of texture2D().
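
I.e. roughly this structure (just a minimal sketch of the idea; NUM_STEPS and the per-iteration offset are made-up placeholders, not my actual shader):

uniform sampler2D myStripeMap;
varying vec2 myTC;

void main()
{
    // Compute the gradients once, in uniform control flow, before the loop.
    vec2 dPdx = dFdx(myTC);
    vec2 dPdy = dFdy(myTC);

    const int NUM_STEPS = 8; // placeholder
    vec4 sum = vec4(0.0);
    for (int i = 0; i < NUM_STEPS; ++i)
    {
        // Placeholder per-iteration offset.
        vec2 tc = myTC + float(i) * vec2(0.01);
        // Explicit gradients, so no implicit derivatives are needed inside the loop.
        sum += texture2DGrad(myStripeMap, tc, dPdx, dPdy);
    }
    gl_FragColor = sum / float(NUM_STEPS);
}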

The thing is, I am using either the bias argument of texture2D(), or texture2DLod() to pick a specific LoD.

Now, texture2DGrad() doesn't accept an LoD or bias argument, because the gradients already carry this information.

So the question is: how do I have to scale the gradients to get a specific LoD or bias?


vec2 dPdx = dFdx(myTC);
vec2 dPdy = dFdy(myTC);


texture2DGrad(myStripeMap, myTC, dPdx, dPdy); // how to force a) a specific lod and b) a specific bias here?

Thanks in advance!

Bye, Julian

P.S. It seems using texture2D() in a loop no longer produces artifacts on NVIDIA on Win32 with drivers > 190.x. Can anyone confirm or deny this?

I’d suggest scaling the gradients by 2^(-bias).

In order to get a specific LoD, use texture2DLod(). In that case you don’t need the gradients.
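
Roughly like this (reusing the sampler and coords from your snippet; the 2.0 is just an example level, and depending on your GLSL version the fragment-shader variant may be texture2DLodARB() from ARB_shader_texture_lod):

// Force mip level 2.0 explicitly; no gradients involved.
vec4 c = texture2DLod(myStripeMap, myTC, 2.0);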

Thanks a lot, I'll try this now…

Dividing the gradients by this value (i.e. multiplying them by 2^bias) works fine.
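
For reference, what it ends up looking like in my shader (uBias is just a placeholder name for my bias constant):

vec2 dPdx = dFdx(myTC);
vec2 dPdy = dFdy(myTC);

// Scaling both gradients by exp2(uBias) (i.e. dividing by exp2(-uBias))
// adds uBias to the LoD the hardware derives from them.
float s = exp2(uBias);
vec4 c = texture2DGrad(myStripeMap, myTC, dPdx * s, dPdy * s);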

Thanks!
