RayTraced Sphere Perspective issue.

This problem is a little involved, but I was hoping it might catch the eye of some advanced GL programmer out there (I’m looking at you DarkPhoton).

I’ve got a shader which ray-traces spheres for a molecular visualizer I’ve written.
Shader Code:
https://github.com/toastedcrumpets/DynamO/blob/experimental/src/magnet/magnet/GL/shader/sphere.hpp

I’m very happy with the shader. It works by:

  • Taking in a VBO of GL_POINTS
  • The geometry shader creates a camera-facing billboard centered at the location of the point with the width/height of the sphere diameter.
  • Then the fragment shader discards all fragments outside the circle inscribed in the billboard, and calculates a normal and gl_FragDepth for each fragment inside the circle using the equation of a sphere (a minimal sketch of this step is given after the list).
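To make that concrete, here’s a minimal sketch of the fragment-shader step (this is not the actual code in the linked sphere.hpp, and all of the identifiers are placeholders I’ve made up for the example):

    #version 330 core
    // Minimal sketch of the circle-discard impostor shading described above.
    // Not the actual shader from sphere.hpp; all names are illustrative.

    in vec2  v_TexCoord;   // billboard coordinates in [-1, 1] from the geometry shader
    in vec3  v_EyeCenter;  // sphere centre in eye space
    in float v_Radius;     // sphere radius

    uniform mat4 u_ProjMat;

    out vec4 f_Color;

    void main()
    {
        // Outside the unit circle -> not part of the sphere.
        float r2 = dot(v_TexCoord, v_TexCoord);
        if (r2 > 1.0)
            discard;

        // Reconstruct the sphere surface as if viewed head-on (this is exactly
        // the approximation that loses the perspective effect discussed below).
        vec3 normal = vec3(v_TexCoord, sqrt(1.0 - r2));
        vec3 eyePos = v_EyeCenter + normal * v_Radius;

        // Write the true depth of the reconstructed surface point
        // (assumes the default glDepthRange of [0, 1]).
        vec4 clipPos = u_ProjMat * vec4(eyePos, 1.0);
        gl_FragDepth = 0.5 * (clipPos.z / clipPos.w) + 0.5;

        // Trivial head-on diffuse shading, just for the sketch.
        f_Color = vec4(vec3(max(normal.z, 0.0)), 1.0);
    }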

The problem is that, although I can now render millions of spheres, they’re not quite perfect.

As an example, the left image is ray-traced, and the right image is from a high resolution triangular mesh.

What I hope you can see is that the triangular-mesh spheres are “fisheyed” by the perspective projection (an off-axis sphere projects to an ellipse, not a circle), whereas my ray-traced spheres always come out as circles. I need some way to add this perspective effect to the ray tracing of the spheres.

This is more important to me than you might think: I’ve played around with head tracking (Wiimote) and 3DTV displays, and the ray-traced spheres don’t quite look right in 3D due to this issue.

For reference, I started with this resource on GLSL sphere ray tracing
http://morgan.leborgne.free.fr/Home.html

And to get anti-aliased ray-traced spheres I used the GL_ARB_sample_shading extension to force per-sample evaluation of the fragment shader.
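For anyone wanting to reproduce the anti-aliasing: besides enabling GL_SAMPLE_SHADING and calling glMinSampleShading(1.0) on the application side, statically reading gl_SampleID (or gl_SamplePosition) in the fragment shader also forces per-sample evaluation. A tiny illustrative stub, not my actual shader:

    #version 400 core
    // Illustrative fragment-shader stub only.
    // Any static use of gl_SampleID (or gl_SamplePosition) forces the shader
    // to run once per sample; the same per-sample evaluation can be requested
    // from the application with glEnable(GL_SAMPLE_SHADING) and
    // glMinSampleShading(1.0), which is what GL_ARB_sample_shading provides.

    out vec4 f_Color;

    void main()
    {
        // The sphere test / discard / depth write would go here, evaluated
        // once per sample because gl_SampleID is referenced below.
        f_Color = vec4(vec3(1.0), float(gl_SampleID >= 0));
    }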

That’s because you’re not raytracing. You’re creating a circle in window-space, which you then fill with shading to appear spherical.

More details, and how to ray-trace correctly, can be found here.
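The gist is to build a ray from the eye through the fragment in camera space and intersect it with the sphere analytically, writing the depth of the real intersection point. A generic sketch of that idea (not the book’s code; every name here is invented):

    #version 330 core
    // Generic eye-space ray/sphere intersection for an impostor fragment.
    // Not the code from the linked chapter; all names are illustrative.

    in vec3  v_EyeDir;     // eye-space position of this fragment on the billboard
    in vec3  v_EyeCenter;  // sphere centre in eye space
    in float v_Radius;

    uniform mat4 u_ProjMat;

    out vec4 f_Color;

    void main()
    {
        // In eye space the camera sits at the origin, so the interpolated
        // billboard position doubles as the ray direction.
        vec3 dir = normalize(v_EyeDir);

        // Solve |t*dir - center|^2 = radius^2 for t (a = 1 since dir is unit length).
        float b    = dot(dir, v_EyeCenter);
        float c    = dot(v_EyeCenter, v_EyeCenter) - v_Radius * v_Radius;
        float disc = b * b - c;
        if (disc < 0.0)
            discard;                       // this ray misses the sphere

        float t      = b - sqrt(disc);     // nearest intersection
        vec3  eyePos = t * dir;
        vec3  normal = normalize(eyePos - v_EyeCenter);

        // Depth of the true intersection point, so spheres interpenetrate
        // correctly (assumes the default glDepthRange of [0, 1]).
        vec4 clipPos = u_ProjMat * vec4(eyePos, 1.0);
        gl_FragDepth = 0.5 * (clipPos.z / clipPos.w) + 0.5;

        // Headlight-style diffuse term, purely for illustration.
        f_Color = vec4(vec3(max(dot(normal, -dir), 0.0)), 1.0);
    }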

I have to say a big thank you for the link; the book is excellent and I wish I’d seen it before. I now know these things are called impostors. There are also a lot of other clear examples of things I was never sure about, so plenty of good reading for me this Christmas.

Yeah, I knew it wasn’t ray tracing, but I was hoping someone had already worked out the details of the ray trace for me (along with the correct sizing/placement of the billboard).

Thanks again,
Marcus

The geometry shader creates a camera-facing billboard centered at the location of the point with the width/height of the sphere diameter.

In what coordinate system is the output of your geometry shader?
Just thinking out loud: if you emit the vertices in eye space and then transform them with the projection matrix, you should end up with a billboard that is not perfectly square. If, additionally, your fragment shader can produce the correct shading for the ellipse that fills the billboard, you should get what you want?
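Something like the following is what I have in mind for the geometry shader (a completely untested sketch, with all names made up):

    #version 330 core
    // Untested sketch: emit the billboard corners in eye space and only then
    // apply the projection matrix. All names here are made up.

    layout(points) in;
    layout(triangle_strip, max_vertices = 4) out;

    in float vg_Radius[];          // per-point sphere radius from the vertex shader

    uniform mat4 u_ProjMat;

    out vec3  gf_EyeDir;           // eye-space corner position, for the fragment shader
    out vec3  gf_EyeCenter;
    out float gf_Radius;

    void main()
    {
        // Assumes the vertex shader applied only the model-view matrix,
        // so gl_Position is still an eye-space position here.
        vec3  center = gl_in[0].gl_Position.xyz;
        float r      = vg_Radius[0];

        // Oversize the quad a little so the perspective-stretched silhouette
        // still fits inside it (1.5 is an arbitrary safety factor).
        float s = 1.5 * r;

        for (int i = 0; i < 4; ++i)
        {
            vec2 corner = vec2((i & 1) == 0 ? -1.0 : 1.0,
                               (i & 2) == 0 ? -1.0 : 1.0);
            vec3 eyePos = center + vec3(corner * s, 0.0);

            gf_EyeDir    = eyePos;
            gf_EyeCenter = center;
            gf_Radius    = r;
            gl_Position  = u_ProjMat * vec4(eyePos, 1.0);
            EmitVertex();
        }
        EndPrimitive();
    }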

The geometry shader is in screen space as the projection happens in the vertex shader.

If you have an idea of the math involved for this, I’d give it a go. I think oversizing the billboard won’t be too costly though, as the discard and per-fragment calculations are relatively cheap.

The geometry shader is in screen space as the projection happens in the vertex shader.

No, the geometry shader works in whatever space you want. And it’s highly unlikely that the GS works in screen space, when the traditional outputs of the vertex shader are in clip space. Screen space happens much later, in the rasterizer.

Note that the shaders on the page that I pointed you to size all of their billboards in camera space. They are then transformed into clip space via the standard projection matrix.
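Concretely, that means the vertex shader applies only the model-view matrix and hands eye-space positions to the geometry shader. A sketch, with invented names rather than anything from your code:

    #version 330 core
    // Sketch only: apply the model-view matrix here and leave the projection
    // to the geometry shader, so billboards can be sized in camera/eye space.
    // Attribute and uniform names are invented for illustration.

    layout(location = 0) in vec3  a_Position;  // sphere centre (model space)
    layout(location = 1) in float a_Radius;

    uniform mat4 u_ModelViewMat;

    out float vg_Radius;

    void main()
    {
        gl_Position = u_ModelViewMat * vec4(a_Position, 1.0);  // eye space, not clip space
        vg_Radius   = a_Radius;
    }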

Sorry, yes: I meant that in my shader the projection matrix is applied in the vertex shader. You’re right, I didn’t mean screen space but clip space.