Panoramic view using cube map

Hello, everybody!

I work with Open Inventor 6.1 and am trying to generate a panoramic view of a virtually generated environment. I want to display 360 degrees in the horizontal plane and 180 degrees in the vertical plane, all on a single screen. The image will look strange and confusing, but it has its advantages.

I think the only way to do this is to have 6 cameras looking in different directions to cover the whole panorama. There are two possible ways to arrange them:

  1. I can have 3 cameras with a 120-degree horizontal FOV and a 90-degree vertical FOV looking at -45 degrees to the horizon, and a second set of 3 cameras looking at +45 degrees to the horizon (something like two tetrahedrons glued to each other). The problem is that in this case the frusta overlap, so I cannot get a coherent picture.

  2. I can have 4 cameras in the horizontal plane looking in perpendicular directions, plus one camera looking up and one looking down, all with a 90x90-degree FOV. This way I can easily see everything and can create a cube map of the environment. The problem is how to display it on a single screen as one nice rectangular picture.
    Something like this: (see the bottom section)
    http://paulbourke.net/miscellaneous/cube2cyl/

I believe the second approach is better. I can produce 6 viewports to display what the six cameras see, but I cannot combine them into a single picture. I think it is possible to generate a cube map texture and then render it onto a rectangle covering the whole screen, but that seems very inefficient in terms of performance.
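
For reference, setting up the six cube-face cameras is the easy part. A simplified sketch (not my actual code) using standard SoPerspectiveCamera nodes looks roughly like this; each face gets a square 90x90-degree frustum, so the six frusta tile the full sphere:

#include <Inventor/nodes/SoPerspectiveCamera.h>
#include <Inventor/SbLinear.h>

// One camera per cube face (+X, -X, +Y, -Y, +Z, -Z).
SoPerspectiveCamera* makeFaceCamera(const SbVec3f& faceDirection,
                                    const SbVec3f& eyePosition)
{
    SoPerspectiveCamera* cam = new SoPerspectiveCamera;
    cam->position    = eyePosition;
    // The default Inventor camera looks down -Z; rotate it towards the face.
    // Note: for the up/down faces the roll given by this shortest-arc
    // rotation may need an extra twist so the face edges line up.
    cam->orientation = SbRotation(SbVec3f(0.0f, 0.0f, -1.0f), faceDirection);
    cam->heightAngle = 1.5707963f;  // 90 degrees, in radians
    cam->aspectRatio = 1.0f;        // square viewport -> 90 degrees horizontally too
    return cam;
}

// The six face directions of the cube map.
static const SbVec3f faceDirections[6] = {
    SbVec3f( 1, 0, 0), SbVec3f(-1, 0, 0),
    SbVec3f( 0, 1, 0), SbVec3f( 0,-1, 0),
    SbVec3f( 0, 0, 1), SbVec3f( 0, 0,-1)
};

What I am missing is the step from these six views to one rectangular picture.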

So if anybody has an idea how to achieve the final result using either method, or can suggest another approach, please let me know.

Thank you very much in advance!

P.S. I am working on Windows XP 32-bit, Open Inventor 6.1, Microsoft Visual Studio 2005.

Indeed, the second approach is much better.
Can you write and run your own fragment shaders in Open Inventor?
Then it is “only” a matter of implementing the math in the shader to backproject the xy screen coordinates to an xyz direction vector and use that as the texture coordinate into the cube map. No tessellation is needed, only a full-screen quad.
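
To give you an idea, here is a rough, untested sketch of what I mean. I am assuming the shader nodes introduced with Open Inventor 6 (SoShaderProgram, SoFragmentShader, SoShaderParameter1i), a cube map texture bound to texture unit 0, and that sourceType GLSL_PROGRAM accepts the GLSL source inline; the names are placeholders, so check the exact fields against your 6.1 documentation:

#include <Inventor/nodes/SoSeparator.h>
#include <Inventor/nodes/SoShaderProgram.h>
#include <Inventor/nodes/SoFragmentShader.h>
#include <Inventor/nodes/SoShaderParameter.h>

// GLSL fragment shader: for every pixel of the full-screen quad, turn the
// 0..1 texture coordinate into longitude/latitude, build the matching 3D
// direction and use it to look up the cube map. The result is one
// equirectangular (360x180 degree) picture of the whole environment.
static const char* kPanoramaFragSrc =
    "uniform samplerCube environmentMap;                              \n"
    "const float PI = 3.14159265358979;                               \n"
    "void main()                                                      \n"
    "{                                                                \n"
    "    vec2  uv  = gl_TexCoord[0].xy;         // 0..1 across screen \n"
    "    float lon = (uv.x - 0.5) * 2.0 * PI;   // -180 .. +180 deg   \n"
    "    float lat = (uv.y - 0.5) * PI;         //  -90 .. +90 deg    \n"
    "    vec3  dir = vec3(cos(lat) * sin(lon),                        \n"
    "                     sin(lat),                                   \n"
    "                     cos(lat) * cos(lon));                       \n"
    "    gl_FragColor = textureCube(environmentMap, dir);             \n"
    "}                                                                \n";

// Hook the shader into the scene graph.
SoSeparator* makePanoramaOverlay()
{
    SoSeparator* sep = new SoSeparator;

    SoShaderParameter1i* texUnit = new SoShaderParameter1i;
    texUnit->name  = "environmentMap";
    texUnit->value = 0;                       // cube map bound to unit 0

    SoFragmentShader* frag = new SoFragmentShader;
    frag->sourceType    = SoFragmentShader::GLSL_PROGRAM;
    frag->sourceProgram = kPanoramaFragSrc;
    frag->parameter.set1Value(0, texUnit);

    SoShaderProgram* prog = new SoShaderProgram;
    prog->shaderObject.set1Value(0, frag);
    sep->addChild(prog);

    // ...then add the cube map texture node and a screen-filling quad with
    // texture coordinates running from (0,0) to (1,1) under this separator.
    return sep;
}

The quad itself is trivial, and all the work is a handful of arithmetic operations per pixel, so performance should not be a problem.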

I have a fragment shader which does the colouring of the terrain, but I am not really familiar with fragment shaders (I am developing software that was originally created by someone else). I thought I would need to change the vertex shader - something like finding the coordinates of every point of the cube and projecting them onto a flat rectangle somehow - but I do not know the maths.

Furthermore, my scene graph is compiled into a .dll, which is used by another program to display the virtual environment. So the whole scene graph goes through a vertex and fragment shader, and I do not know how to exclude the quad from that shader (otherwise it will be distorted by it).
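
I suppose I would have to keep the quad in its own branch with its own shader, next to the scene branch that comes out of the .dll, so that the terrain shader does not touch it. Schematically (just a guess, heavily simplified):

#include <Inventor/nodes/SoSeparator.h>

// The terrain keeps its shaders in its own branch; the panorama quad sits
// in a sibling branch with the cube-map shader. Since an SoSeparator pushes
// and pops the traversal state, the terrain shader should not leak into
// the quad's branch.
SoSeparator* buildRoot(SoSeparator* terrainFromDll,    // existing scene + its shaders
                       SoSeparator* panoramaOverlay)   // quad + cube-map shader
{
    SoSeparator* root = new SoSeparator;
    root->addChild(terrainFromDll);
    root->addChild(panoramaOverlay);  // drawn after the scene
    return root;
}

But I am not sure whether that is the right way to do it in my setup.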

Go to Humus - Textures and download the cubemap viewer (at the top of the page).
There is also source code.

I found this app quite useful. It could serve as a reference if nothing else.

Thanks for the advice.

The app is nice; however, it creates a continuous image that is displayed only partially. I need to display all of the images on a single screen at once, to get something like a heavily distorted map projection.