Aliasing effects in GPU-based raycaster

Hi,

We have implemented a GPU-based single-pass raycaster to render volumes.

We are getting aliasing effects (Moiré patterns) on the volume and have tried a lot of things, but we haven't been able to get rid of them.

Here is a screenshot: http://userpages.umbc.edu/~alark1/iron_protein.jpg

Any advice/suggestions will be appreciated.

Thanks,

Alark

Do your edges get brighter when you hit a spot with low density?
Does it get better with increased resolution?
If not, is there a lighting calculation done per pixel?
For the normal calculation, do you read the density data axis-aligned in texture space or axis-aligned in view space?
Do it in texture space: read at the centers of the voxels, use six lookups with nearest filtering, and derive the normal from the central differences. Don't forget to normalize it.
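If it helps, here is a rough sketch of that six-lookup gradient in plain C. The names density() and volume_normal() are made up here; density() stands in for your nearest-filtered 3D texture fetch, and the same structure carries over directly to a shader:

```c
#include <math.h>

typedef struct { float x, y, z; } vec3;

/* Stand-in for a nearest-filtered lookup into a dim^3 density volume
 * stored as a flat array; clamps to the border so edge voxels still
 * get a usable gradient. */
static float density(const float *vol, int dim, int x, int y, int z)
{
    if (x < 0) x = 0; if (x >= dim) x = dim - 1;
    if (y < 0) y = 0; if (y >= dim) y = dim - 1;
    if (z < 0) z = 0; if (z >= dim) z = dim - 1;
    return vol[(z * dim + y) * dim + x];
}

/* Six lookups at voxel centers, one central difference per axis,
 * then normalize. Depending on your density convention you may need
 * to flip the sign to get an outward-facing normal. */
vec3 volume_normal(const float *vol, int dim, int x, int y, int z)
{
    vec3 g = {
        density(vol, dim, x + 1, y, z) - density(vol, dim, x - 1, y, z),
        density(vol, dim, x, y + 1, z) - density(vol, dim, x, y - 1, z),
        density(vol, dim, x, y, z + 1) - density(vol, dim, x, y, z - 1),
    };
    float len = sqrtf(g.x * g.x + g.y * g.y + g.z * g.z);
    if (len > 0.0f) { g.x /= len; g.y /= len; g.z /= len; }
    return g;
}
```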

One way is to improve the sampling characteristics; pre-integration would work well. A simpler approach is to break up the regular sampling that leads to the Moiré patterns: instead of just using the first hit on the bounding box as the first sample point, jitter it by some random fraction of the sampling distance along the ray (and then increment the position by the sampling distance as usual). The result is a noisier image, but the noise is at much lower frequencies.
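A minimal sketch of that jitter in C, assuming a small per-pixel integer hash supplies the random fraction (the hash and its constants are just an illustration, not part of the suggestion itself):

```c
#include <stdint.h>

/* Hypothetical integer hash giving each pixel its own repeatable
 * jitter value in [0, 1); the constants are arbitrary. */
static float jitter01(uint32_t px, uint32_t py)
{
    uint32_t h = px * 374761393u + py * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (float)(h ^ (h >> 16)) / 4294967296.0f;
}

/* t_hit is where the ray enters the bounding box; offset the first
 * sample by a random fraction of the step, then march as usual. */
void march(float t_hit, float t_exit, float step, uint32_t px, uint32_t py)
{
    for (float t = t_hit + jitter01(px, py) * step; t < t_exit; t += step) {
        /* sample the volume at distance t and composite as before */
    }
}
```

Using a per-pixel hash rather than one shared random sequence keeps the offset stable from frame to frame, so the noise doesn't shimmer.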

Aliasing is usually an effect of not sampling at a rate high enough to reproduce the signal. The sampling rate necessary for volume rendering depends not only on the size of your voxels, i.e. the highest frequencies in your volume data, but also on your color and alpha lookup tables. If you use LUTs (so-called transfer functions) with high frequencies, you have to sample at a much higher rate. For example, to render a surface – like you did – you use a lookup table that contains a step function in the alpha channel, which results in an infinite frequency in your mapped volume data. So you would basically have to sample at an extremely high rate to get nice-looking images.
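To make that concrete, here is a toy C sketch of how the step size might be tied to the voxel spacing and the steepness of the transfer function. The names and the factor 1/8 are hypothetical, purely for illustration:

```c
/* Half the voxel spacing satisfies Nyquist for the raw data alone,
 * but a steep (step-like) alpha ramp in the transfer function raises
 * the frequency content of the classified volume, so it needs a much
 * smaller fraction. 1/8 is an arbitrary example, not a recommendation. */
float step_size(float voxel_spacing, int transfer_function_is_steep)
{
    const float fraction = transfer_function_is_steep ? 1.0f / 8.0f : 0.5f;
    return voxel_spacing * fraction;
}
```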