Stereo images and 3D glasses

Stereo buffers, which are meant to be used for generating stereoscopic images, are not supported by most OpenGL implementations for Windows.
A common method to generate stereo images without the use of stereo buffers is to use interlaced left and right images with an interlaced video mode.
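For reference, here is roughly how an application would ask for stereo buffers on Windows; on most consumer drivers the PFD_STEREO request is simply refused, which is exactly the limitation in question. This is a minimal sketch, not taken from any particular driver:

    #include <windows.h>
    #include <GL/gl.h>

    /* Try to get a pixel format with quad-buffered stereo.
       On most consumer cards/drivers this request fails. */
    BOOL SetupStereoPixelFormat(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd;
        int format;

        ZeroMemory(&pfd, sizeof(pfd));
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                         PFD_DOUBLEBUFFER   | PFD_STEREO;  /* the stereo request */
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 24;
        pfd.cDepthBits = 16;

        format = ChoosePixelFormat(hdc, &pfd);
        if (format == 0)
            return FALSE;

        /* ChoosePixelFormat may return the closest match without
           PFD_STEREO, so check what the driver actually granted. */
        DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);
        if (!(pfd.dwFlags & PFD_STEREO))
            return FALSE;   /* no stereo buffers available */

        return SetPixelFormat(hdc, format, &pfd);
    }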

Yet, there are now 3D glasses (Elsa Revelator, Asus VR 3D) that work in sync with rapidly alternating non-interlaced images on the monitor.

How does the video card manage to follow a refresh rate of 100Hz+ without the use of stereo buffers while playing demanding 3D games (or maybe it does use some kind of stereo buffers)?

Also, almost any game can be played with those glasses, even if it has no built-in function to generate the left and right images. How are these images generated? By a special driver?
Also, I think that additional transformations are needed to generate these images (translations & rotations).
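For what it's worth, the usual transformations are a small horizontal eye offset plus an off-axis (asymmetric) projection. Something along these lines would produce the two views from a single camera; the separation and convergence values here are made up purely for illustration:

    #include <GL/gl.h>

    /* Set up the projection and view for one eye.
       eye = -1 for the left eye, +1 for the right eye. */
    void SetStereoView(int eye)
    {
        const double sep   = 0.06;  /* eye separation in world units (assumed) */
        const double conv  = 10.0;  /* distance to the plane of zero parallax  */
        const double znear = 1.0, zfar = 100.0;
        const double top   = 0.75, right = 1.0;  /* symmetric frustum at znear */

        /* Shift the frustum horizontally so both eyes converge at 'conv'. */
        double shift = 0.5 * sep * znear / conv * eye;

        glMatrixMode(GL_PROJECTION);
        glLoadIdentity();
        glFrustum(-right - shift, right - shift, -top, top, znear, zfar);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        /* Translate the scene opposite to the eye offset. */
        glTranslated(-0.5 * sep * eye, 0.0, 0.0);
    }

A driver doing this transparently for an existing game would have to inject something equivalent into the game's own projection setup, which is presumably where the special driver comes in.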

Thx for your replies

That boggles me too. How can you get the stereo effect with only one framebuffer and no special code to generate the frame?

I think the concept is about the same as with those stereographic pictures you saw a lot of a few years back. The ones that looked like black/white noise until you focused your eyes and an image appeared in “3D”. They were also produced from one single image.

But I think it’s some kind of special driver that deals with it. After reading the manual for my Asus board, in the section about their 3D glasses add-on, I saw that you had to enable this feature. So I suppose it’s a driver issue.

Anyway, doing it with just one image will surely cause a parallax effect when moving your head left/right relative to your monitor (up/down as well, but that sure looks silly in a third-person view).

The real problem hardware-wise is, I think, not the video card being able to deliver 100+Hz to the monitor, but rather the opposite. A modern card can do 120Hz, if not more, at 1280x1024. The problem is the monitor; mine can do a maximum of 85Hz at the above-mentioned resolution. Monitors that can do 100+Hz at very high resolutions are generally quite expensive.

Well, for the monitor, I guess that either you pay a lot of $$$ … sorry €€€, or you have to cope with 640x480@100Hz (and if your monitor can’t even do that, well … you’re kind of f***ed).

What I meant about the card being able to follow a refresh rate of 100+Hz was not that it might be unable to generate a video signal at that rate (most cards can now do 120Hz - 140Hz, maybe more), but that I was wondering how it could switch between two images on every new scan without the help of stereo buffers. In some games, even with the fastest graphics cards, performance is about 60 - 80fps. This means that if the card has to display 100+ frames per second, it has to render somewhere other than the back buffer: that is, keep two buffers holding the left and right images, copy one of them to the back buffer before each swap at 100+Hz, and still have other buffers to render to at the same time. Not easy to handle without proper stereo buffers or even auxiliary buffers.
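For comparison, here is what the rendering loop looks like when the implementation does expose proper stereo buffers. GL_BACK_LEFT and GL_BACK_RIGHT are standard OpenGL draw buffers; consumer drivers just rarely provide them. Sketch only, with SetStereoView and DrawScene standing in for the application’s own code:

    #include <GL/gl.h>

    extern void SetStereoView(int eye);  /* e.g. the off-axis setup above */
    extern void DrawScene(void);         /* application-specific (assumed) */

    /* One frame of quad-buffered stereo: both eyes are rendered into
       their own back buffer, and a single buffer swap presents the pair.
       The card then alternates left/right output on each vertical scan,
       in sync with the shutter glasses. */
    void RenderStereoFrame(void)
    {
        glDrawBuffer(GL_BACK_LEFT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        SetStereoView(-1);
        DrawScene();

        glDrawBuffer(GL_BACK_RIGHT);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        SetStereoView(+1);
        DrawScene();

        /* SwapBuffers(hdc) would follow here in the Windows message loop. */
    }

With this scheme the scene only needs to be rendered at the game’s frame rate; the hardware repeats the front left/right pair at the full refresh rate, which is presumably what the driver-level trick has to emulate with ordinary buffers.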

All this makes me think that to achieve this 3D effect, the standard OGL implementation with only double-buffering must be somewhat modified…

But maybe, as this seems to be a driver issue, it has nothing to do with OpenGL or D3D and is just some kind of emulation at the card’s level.

Does anyone have a clue?

PS: Where are the nVidia guys when we need them ?