How to generate stereo manually

Hi
I started learning OpenGL. I worked my way through the book “Beginning OpenGL Game Programming, Second Edition”.
I think it is a great source for learning OpenGL 3.

My goal is to generate a 3D world which is rendered in stereo.
I have the nVidia GTS250 and the 3D Vision shutter glasses.

3D Vision is capable of displaying scenes in stereo even when they were not developed that way. So it probably intercepts the (mono) rendering and generates the second (stereo) view automatically for you.

I would like to develop an application that does the stereo rendering manually, so that I have complete control over it.

So I guess I have to use quad-buffered mode. But how can I make this work with the nVidia shutter glasses? How can I make the shutter glasses sync to the monitor? I do not want nVidia to generate the stereo automatically, so should I remove the nVidia 3D Vision driver?

Do I also have to tune my graphics card using RivaTuner? Or is this card capable of quad-buffered mode out of the box?

I would be very happy if someone could show me an example program of how to do that, preferably without using GLUT.

Historically, I think quad-buffered stereo has been Quadro-only. The NVidia driver README is pretty explicit about quad-buffered stereo being Quadro-only. So if the docs are correct, it shouldn’t work on your GTS250.

However, I thought I’d heard and read that the true shutter-glasses stereo solution (aka “NVidia 3D Vision”, as opposed to “NVidia 3D Vision Discover”, which is just cheap red/blue anaglyph glasses) was locked to Quadros only. For instance:

However, here:

it says it supports “full NVidia 3D Vision”, whatever marketing decided that means.

So who knows. I’m not impressed with how unclear the information is out there on NVidia’s 3D Vision support.

If you don’t get some great info here, I suggest you try http://forums.nvidia.com or http://developer.nvidia.com/forums. The former has an explicit forum for the 3D Vision stuff (NVIDIA Forums > nZone > Hardware > GeForce 3D Vision).

The 3D Vision shutter glasses do work with the GTS250 card. No problem.
But the question is whether I can use the quad-buffer mechanism when programming it manually in OpenGL.
Or is that for Quadro cards only?

From the README for the recent NVidia driver I have installed here on Linux:

Option “Stereo” “integer”

Enable offering of quad-buffered stereo visuals on Quadro.

Stereo is only available on Quadro cards.

But given how inconsistent their 3D Vision/stereo info is, I suggest you ask NVidia on their forums.

I used to have an ASUS GeForce 3 that did quad-buffered stereo and came with LCD shutter glasses.
It suffered from ghosting because the LCD didn’t go fully black fast enough, and the lenses were heavily tinted to make the flickering less noticeable, but that just made everything too dark.
The technology at the time just wasn’t good enough, so they stopped making the 3D versions of their GeForce cards.

Quad-buffered stereo is part of the OpenGL standard and is very simple to implement (just switch between four buffer pointers at VSYNC instead of two), so the only reason for it not to work on a GTS250 is if the NVIDIA marketing department decided it wanted to force people who want 3D to buy a more expensive card.

The ASUS card had a socket for the 3D glasses on the backplate next to the VGA socket, but before that there were homebrew devices controlled from the computer’s parallel port that seemed to work with any graphics card.

The standard NVIDIA driver does not offer stereo pixel formats, so you probably need to keep the 3D vision driver.

Do the following to find out if stereo is supported in OpenGL (assuming you’re using Windows):
Set up a PixelFormatDescriptor structure, setting the PFD_STEREO flag in addition to the usual OpenGL flags, as follows:
{#NOTE# This is Pascal, but you should be able to convert it to whatever language you are using, or just set the individual fields in your code}

var
  PFD: PixelFormatDescriptor = (
    nSize: sizeof(PixelFormatDescriptor); {Size}
    nVersion: 1;  {version}
    dwFlags: PFD_Draw_To_Window + PFD_Support_OpenGL + PFD_DoubleBuffer + PFD_STEREO; {Flags}
    iPixelType: PFD_Type_RGBA; {Color Mode}
    cColorBits: 32; {Color buffer}
    cDepthBits: 24 ); {Depth buffer}

In the code we now ask Windows to find the closest matching pixel format (ChoosePixelFormat), set it (SetPixelFormat), then use DescribePixelFormat to see what it actually gave us:

  PixelFormatID := ChoosePixelFormat( DeviceContext, @PFD ); 
  if PixelFormatID = 0 then RaiseLastOSError;                
  if not SetPixelFormat( DeviceContext, PixelFormatID, @PFD ) then RaiseLastOSError;
  MaxPixFormat := DescribePixelFormat( DeviceContext, PixelFormatID, SizeOf(PIXELFORMATDESCRIPTOR), PFD );

If the PFD_STEREO bit is still set, you have a quad-buffered framebuffer and can call glDrawBuffer(GL_BACK_LEFT) to draw the left-eye image and glDrawBuffer(GL_BACK_RIGHT) to draw the right-eye image.

It seems that I can set the PFD_STEREO flag. After the ChoosePixelFormat and SetPixelFormat calls, the flag is still set.
But when I call glDrawBuffer(GL_BACK_RIGHT) I get a GL_INVALID_OPERATION error code.
When calling glDrawBuffer(GL_BACK_LEFT) I get no error code.

The complete render code is:


void Example::render()
{
    GLenum error = GL_NO_ERROR;

    error = glGetError();
    glDrawBuffer(GL_BACK_LEFT);
    error = glGetError();

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    glRotatef(2 * m_rotationAngle, 0, 0, 1);

    glBegin(GL_TRIANGLES);
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
        glVertex3f(-1.0f, -0.5f, -4.0f);
        glColor4f(1.0f, 1.0f, 0.0f, 1.0f);
        glVertex3f(1.0f, -0.5f, -4.0f);
        glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
        glVertex3f(0.0f, 0.5f, -4.0f);
    glEnd();

    error = glGetError();
    glDrawBuffer(GL_BACK_RIGHT);
    error = glGetError();

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    glRotatef(m_rotationAngle, 0, 0, 1);

    glBegin(GL_TRIANGLES);
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
        glVertex3f(-1.0f, -0.5f, -4.0f);
        glColor4f(1.0f, 1.0f, 0.0f, 1.0f);
        glVertex3f(1.0f, -0.5f, -4.0f);
        glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
        glVertex3f(0.0f, 0.5f, -4.0f);
    glEnd();
}

I try to draw a triangle into the left buffer and then another one (at a different angle) into the right one. But I always only get the second one on screen.

Quad buffering is essentially dead, since support for it was killed in Vista.

dead???
Do you mean that the nVidia Vista driver implementation of OpenGL does not support it? And is this the same for Windows 7?
But nVidia sells this 3D Vision package.

After the ChoosePixelFormat and SetPixelFormat calls, the flag is still set.

Neither of these writes to the PFD structure; you must use DescribePixelFormat before checking the PFD to see the flags (and other settings) of the pixel format that ChoosePixelFormat returned.

Are you using full-screen or a window? The NVIDIA site says that 3D is currently only supported for full-screen contexts.
Do you have the latest versions of both drivers? The early versions were difficult to get working.
Have you tried different NVIDIA control-panel settings for 3D?

NVIDIA 3D is supposed to be supported on XP, Vista and 7.
It is however possible that they have deliberately disabled direct access to the 3D buffers from OpenGL programs on GeForce cards for marketing reasons (so they can sell more of the expensive Quadros).
A quick search of NVIDIA’s site found several articles on what does and doesn’t work, each of which completely disagreed with what any of the others said.

The call glGetIntegerv(GL_STEREO, &isStereo) results in isStereo being false.
So I guess I’m out of luck. I also tried full screen; that doesn’t work either.

I bought the 3D Vision kit from nVidia on the assumption that I could create the 3D myself, but that seems not to be the case.
I cannot afford an expensive Quadro card.

The RivaTuner tool also does not support the latest Windows 7 driver, so that is not a solution either.

:frowning:

Does anyone know if I can create 3D with an ATI card?

With the Vuzix VR920 that I have, I just have to call an API from a provided DLL to mark frames. Maybe there’s something similar exposed in the driver (and the DLLs that come with it), even if it’ll be hackish. Look it up with Depends.exe.
I’d look into DX-land, where there could be examples.

Ah, there is also “iZ3D”:
http://www.iz3d.com/licenses

The only way to get 3D on ATi is via the iZ3D or VR920 drivers.

One other possibility is to simulate quad buffering by drawing the left and right views on alternate frames.
Use wglSwapIntervalEXT(1) to ensure you are locked to VSYNC.
Draw the left view, SwapBuffers; draw the right view, SwapBuffers; repeat.

Now you just need to convince the 3D Vision glasses to switch views at each VSYNC pulse even though the driver thinks it’s drawing 2D.
See if there is an option in the control panel to force the 3D mode on.
If not, then the only options left are some sort of software or hardware hack.

You will also need to time each frame accurately, as any skipped frame will swap the left/right views, and provide some way in your application (e.g. a hotkey) to swap them if the sequence starts the wrong way around.

Don’t these glasses have drivers that are supposed to take an ordinary 3D scene, complete with depth buffer, and fake quad-buffer stereo, without the application having to be written for it? As I understood it, that was the idea behind the thing.

I think internally quad buffering is used and the frames are rendered to two buffers at the same time with a small camera view offset. But the quad-buffer option seems to be disabled in the OpenGL API. A marketing issue, I guess.

Oh please. Killed in the OS that hardly anyone “downgraded” to?

Oh please. Killed in the OS that hardly anyone “downgraded” to?
Nvidia says quad-buffered stereo works on Vista/Win7:
http://www.nvidia.com/object/quadro_pro_graphics_boards.html
Anyone with actual experience of this?

No actual experience, but shuttered stereo definitely does not need quad buffers.
‘Shuttering’ means we display two images one after the other and have to cover the left and right eye at the right time so that only the correct image is seen.
This technology is nothing new; it only works better these days because there are 120Hz LCD displays on the market now, which you need to get the NVision glasses to work.

Theoretically, if the glasses start shuttering automatically, you only need to render the frames alternating and synced to the display. Without special drivers you might need to add functionality to switch the dominance from left to right (just display one frame twice to swap the left and right images).

There have always been third-party shutter glasses that could sync to a VGA signal…

I hope somebody can confirm this with the 3D Vision glasses so I can get a set myself. :smiley:

you only need to render the frames alternating and synced to the display

And what do you do when rendering a frame takes longer than 1/60th of a second? With quad-buffered stereo, the GPU continues to display the previous left and right images until a new pair is ready, however long that takes.

Then you probably end up with reversed left and right pictures.
I think quad buffering is a must.
But…
I would like to generate the left and right images for different moments in time.
So I would like to render the left picture, then swap that buffer (without swapping the right picture), then render the right picture and swap that one (without swapping the left). This is to overcome motion artifacts.

Is that possible with quad buffering?