
View Full Version : How to generate stereo manually



melomania
12-31-2009, 06:48 AM
Hi
I started learning OpenGL. I worked my way through the "Beginning OpenGL Game Programming, Second Edition" book.
I think it is a great source for learning OpenGL 3.

My goal is to generate a 3D world which is rendered in stereo.
I have the nVidia GTS250 and the 3D Vision shutter glasses.

3D Vision is capable of displaying scenes in stereo even when they were not developed that way. So it probably overrides the (mono) rendering and generates the second (stereo) view automatically for you.

I would like to develop some application that does stereo rendering manually. So that I have complete control over it.

So I guess I have to use quad buffer mode. But how can I make this work with the nVidia shutter glasses? How can I make the shutter glasses sync to the monitor? I do not want nVidia to generate the stereo automatically, so should I remove the nVidia 3D Vision driver stuff?

Do I also have to tune my graphics card using RivaTuner? Or is this card capable of quad buffer mode on its own?

I would be very happy if someone could show me a program example of how to do that. Preferably without using Glut.

Dark Photon
12-31-2009, 07:38 AM
My goal is to generate a 3D world which is rendered in stereo. I have the nVidia GTS250 and the 3D Vision shutter glasses. ... So I guess I have to use quad buffer mode.
Historically, I think quad-buffered stereo has been Quadro-only. The NVidia driver README is pretty explicit about quad-buffered stereo being Quadro-only. So if the docs are correct, that shouldn't work on your GTS250.

I also thought I'd heard and read that the true shutter-glasses stereo solution (aka "NVidia 3D Vision", in contrast to "NVidia 3D Vision Discover", which is just cheap red/blue anaglyph glasses) was locked to Quadros only. For instance:

* 3D Vision Overview (NVidia) (http://www.nvidia.com/object/3D_Vision_Overview.html)

However, here:

* NVidia GTS250 (http://www.nvidia.com/object/product_geforce_gts_250_us.html)

it says it supports "full NVidia 3D Vision", whatever marketing decided that means.

So who knows. I'm not impressed with how unclear the information is out there on NVidia's 3D Vision support.

If you don't get some great info here, I suggest you try http://forums.nvidia.com or http://developer.nvidia.com/forums. The former has an explicit forum for the 3D Vision stuff (NVIDIA Forums > nZone > Hardware > GeForce 3D Vision).

melomania
12-31-2009, 09:00 AM
The 3D Vision shutter glasses do work with the GTS250 card. No problem.
But the question is whether I can use the quad buffer mechanism when programming manually using OpenGL.
Or is that for Quadro cards only?

Dark Photon
12-31-2009, 10:04 AM
But the question is whether I can use the quad buffer mechanism when programming manually using OpenGL.
Or is that for Quadro cards only?
From the README for the recent NVidia driver I have installed here on Linux:


Option "Stereo" "integer"

Enable offering of quad-buffered stereo visuals on Quadro.
...
Stereo is only available on Quadro cards.
But given how inconsistent their 3D Vision/stereo info is, I suggest you ask NVidia on their forums.

Simon Arbon
12-31-2009, 06:54 PM
Historically, I think quad-buffered stereo has been Quadro-only.
I used to have an ASUS GeForce 3 that was quad-buffered stereo and came with LCD shutter glasses.
It suffered from ghost images because the LCD did not go fully black fast enough, and the lenses were heavily tinted to make the flickering less noticeable, but that just made everything too dark.
The technology at the time just wasn't good enough, so they stopped making the 3D versions of their GeForce cards.

Quad-buffered stereo is part of the OpenGL standard and is very simple to implement (just switch between 4 different buffer pointers at VSYNC instead of 2), so the only reason for it not to work on a GTS250 is if the NVIDIA marketing department decided it wanted to force people who want 3D to buy a more expensive card.

The ASUS card had a socket for the 3D glasses on the backplate next to the VGA socket, but before that there were homebrew devices controlled from the computer's parallel port that seemed to work with any graphics card.

Simon Arbon
12-31-2009, 09:41 PM
I guess I have to use quad buffer mode. But how can I make this work with the nVidia shutter glasses? How can I make the shutter glasses sync to the monitor? I do not want nVidia to generate the stereo automatically, so should I remove the nVidia 3D Vision driver stuff?
I would be very happy if someone could show me a program example of how to do that. Preferably without using Glut.
The standard NVIDIA driver does not offer stereo pixel formats, so you probably need to keep the 3D Vision driver.

Do the following to find out if stereo is supported in OpenGL (assuming you're using Windows):
Set up a PixelFormatDescriptor (http://msdn.microsoft.com/en-us/library/dd368826(VS.85).aspx) structure, setting the PFD_STEREO flag in addition to the usual OpenGL flags, as follows:
{#NOTE# This is in Pascal, but you should be able to convert it to whatever language you are using, or just set the individual fields in your code}

var
  PFD: PixelFormatDescriptor = (
    nSize: sizeof(PixelFormatDescriptor);  {Size}
    nVersion: 1;                           {Version}
    dwFlags: PFD_Draw_To_Window + PFD_Support_OpenGL + PFD_DoubleBuffer + PFD_STEREO;  {Flags}
    iPixelType: PFD_Type_RGBA;             {Color mode}
    cColorBits: 32;                        {Color buffer}
    cDepthBits: 24 );                      {Depth buffer}

In the code we now ask Windows to find the closest matching pixel format (ChoosePixelFormat (http://msdn.microsoft.com/en-us/library/dd318284(VS.85).aspx)), set it (SetPixelFormat (http://msdn.microsoft.com/en-us/library/dd369049(VS.85).aspx)), then use DescribePixelFormat (http://msdn.microsoft.com/en-us/library/dd318302(VS.85).aspx) to see what it actually gave us:

PixelFormatID := ChoosePixelFormat( DeviceContext, @PFD );
if PixelFormatID = 0 then RaiseLastOSError;
if not SetPixelFormat( DeviceContext, PixelFormatID, @PFD ) then RaiseLastOSError;
MaxPixFormat := DescribePixelFormat( DeviceContext, PixelFormatID, SizeOf(PIXELFORMATDESCRIPTOR), PFD );

If the PFD_STEREO bit is still set then you have a quad-buffered framebuffer and can call glDrawBuffer(GL_BACK_LEFT) to draw the left-eye image and glDrawBuffer(GL_BACK_RIGHT) to draw the right-eye image.

melomania
01-01-2010, 09:12 AM
It seems that I can set the PFD_STEREO flag. After the ChoosePixelFormat and SetPixelFormat calls, the flag is still set.
But when I call glDrawBuffer(GL_BACK_RIGHT) I get a GL_INVALID_OPERATION error code.
When calling glDrawBuffer(GL_BACK_LEFT) I get no error code.

The complete render code is:


void Example::render()
{
    GLenum error = GL_NO_ERROR;

    error = glGetError();
    glDrawBuffer(GL_BACK_LEFT);
    error = glGetError();

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    glRotatef(2 * m_rotationAngle, 0, 0, 1);

    glBegin(GL_TRIANGLES);
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
        glVertex3f(-1.0f, -0.5f, -4.0f);
        glColor4f(1.0f, 1.0f, 0.0f, 1.0f);
        glVertex3f(1.0f, -0.5f, -4.0f);
        glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
        glVertex3f(0.0f, 0.5f, -4.0f);
    glEnd();

    error = glGetError();
    glDrawBuffer(GL_BACK_RIGHT);
    error = glGetError();

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();

    glRotatef(m_rotationAngle, 0, 0, 1);

    glBegin(GL_TRIANGLES);
        glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
        glVertex3f(-1.0f, -0.5f, -4.0f);
        glColor4f(1.0f, 1.0f, 0.0f, 1.0f);
        glVertex3f(1.0f, -0.5f, -4.0f);
        glColor4f(0.0f, 0.0f, 1.0f, 1.0f);
        glVertex3f(0.0f, 0.5f, -4.0f);
    glEnd();
}


I try to draw one triangle for the left buffer and then another one (at a different angle) for the right one. But I always only get the second one on screen.

dukey
01-01-2010, 02:29 PM
quad buffer is essentially dead since support for it was killed in vista

melomania
01-01-2010, 04:40 PM
quad buffer is essentially dead since support for it was killed in vista


dead????
Do you mean that the nVidia Vista driver implementation of OpenGL does not support it? And is this the same for Windows 7?
But nVidia sells this 3D Vision package.

Simon Arbon
01-01-2010, 11:32 PM
After the ChoosePixelFormat and SetPixelFormat calls, the flag is still set.
Neither of these write to the PFD structure, you must use DescribePixelFormat before checking the PFD to see the flags (and other settings) of the pixel format that ChoosePixelFormat returned.

Are you using full-screen or a window? The NVIDIA site says that 3D is currently only supported for full-screen contexts.
Do you have the latest versions of both drivers? The early versions were difficult to get working.
Have you tried different NVIDIA control-panel settings for 3D?

NVIDIA 3D is supposed to be supported on XP, Vista and 7.
It is however possible that they have deliberately disabled direct access to the 3D buffers by OpenGL programs on the GeForce cards for marketing reasons (so they can sell more of the expensive Quadros).
A quick search of NVIDIA's site found several articles on what does and doesn't work, each of which completely disagreed with what any of the others said.

melomania
01-02-2010, 09:14 AM
The call glGetIntegerv(GL_STEREO, &isStereo) results in isStereo being false.
So I guess I'm out of luck. I also tried full screen. That doesn't work either.

I bought the 3D Vision from nVidia with the assumption that I could create 3D myself. But this seems not to be the case.
I can not afford an expensive Quadro card.

The RivaTuner tool also does not support the latest Windows 7 driver. So that is not a solution either.

:(

Does anyone know if I can create 3D with an ATI card?

Ilian Dinev
01-02-2010, 09:49 AM
With the Vuzix VR920 that I have, I just have to call an API from a provided DLL to mark frames. Maybe there's something similar exposed in the driver (and the DLLs that come with it), even if it'll be hackish. Look it up with Depends.exe.
I'd look into DX-land, where there could be examples.

Ah, also there are the "iZ3D"
http://www.iz3d.com/licenses

The only way to have 3D on ATi is via iZ3D or VR920's drivers.

Simon Arbon
01-02-2010, 05:50 PM
One other possibility is to simulate quadbuffering by drawing the left and right views in alternate frames.
Use wglSwapInterval(1); to ensure you are locked to VSYNC.
Draw Left view, SwapBuffers, draw right view, swapbuffers, repeat.

Now you just need to convince the 3D Vision glasses to switch views at each VSYNC pulse even though the driver thinks it's drawing 2D.
See if there is an option in the control panel to force the 3D mode on.
If not then the only options left are some sort of software or hardware hack.

You will also need to accurately time each frame, as any skipped frame will swap the left/right views, and provide some way in your application (e.g. a hotkey) to swap them if it starts the wrong way around.

Alfonse Reinheart
01-03-2010, 12:31 AM
Don't these glasses have drivers that are supposed to take an ordinary 3D scene, complete with depth buffer, and fake quad-buffer stereo, without the application having to be written for it? As I understood it, that was the idea behind the thing.

melomania
01-03-2010, 05:59 AM
Don't these glasses have drivers that are supposed to take an ordinary 3D scene, complete with depth buffer, and fake quad-buffer stereo, without the application having to be written for it? As I understood it, that was the idea behind the thing.

I think internally quad buffering is used and the frames are rendered to two buffers at the same time with a small camera view offset. But the quad buffer option seems to be disabled in the OpenGL API. A marketing issue, I guess.

Dark Photon
01-03-2010, 07:56 PM
quad buffer is essentially dead since support for it was killed in vista
Oh please. Killed in the OS that hardly anyone "downgraded" to?

ZbuffeR
01-04-2010, 07:28 AM
quad buffer is essentially dead since support for it was killed in vista
Oh please. Killed in the OS that hardly anyone "downgraded" to?

Nvidia says quad-buffered stereo works on Vista/Win7:
http://www.nvidia.com/object/quadro_pro_graphics_boards.html
Anyone with actual experience of this?

def
01-04-2010, 08:53 AM
No actual experience, but shuttered stereo definitely does not need quad buffers.
'Shuttering' means we display the two images one after the other and cover the left and right eye at the right time, so only the correct image is seen.
This technology is nothing new; it only works better these days because there are 120Hz LCD displays on the market now, which you need to get the NVision glasses to work.

Theoretically, if the glasses start shuttering automatically, you only need to render the frames alternating and synced to the display. Without special drivers you might need to add a way to switch the dominance from left to right (just display one frame twice to swap the left and right images).

There always have been 3rd party shutterglasses that were able to sync to a VGA signal...

I hope somebody can confirm this with the 3D Vision glasses so I can get a set myself. :D

ZbuffeR
01-04-2010, 09:06 AM
you only need to render the frames alternating and synced to the display
And what do you do when rendering a frame takes longer than 1/60th of a second? With quad-buffered stereo, the GPU continues to display the previous left and right images until a new set is ready, whatever the time needed to do so.

melomania
01-04-2010, 04:00 PM
you only need to render the frames alternating and synced to the display
And what do you do when rendering a frame takes longer than 1/60th of a second? With quad-buffered stereo, the GPU continues to display the previous left and right images until a new set is ready, whatever the time needed to do so.

Then you probably end up with reversed left and right pictures.
I think quad buffering is a must.
But ....
I would like to generate the left and right images for different moments in time.
So I would like to render the left picture, then swap this buffer (without swapping the right picture). Then render the right picture and swap that one (without swapping the left). This to overcome motion problems.

Is that possible with quad buffering?

def
01-05-2010, 02:43 AM
And what do you do when rendering a frame takes longer than 1/60th of a second ? With quad buffered stereo, the gpu continues to display previous left and right images until a new set is ready, whatever the time needed to do so.

That's obviously a plus for having a Quadro card. Which makes me wonder how Nvidia handles that problem in their 3D Vision driver...
You basically need to know your framerate BEFORE you render.

I assume the technical details are not going to be revealed by Nvidia.

An idea: You could render both eyes into textures and decide after rendering which one should be displayed before swap. This way a switch of left to right might be avoided.

ZbuffeR
01-05-2010, 02:53 AM
This is not complex for the card, whether it is a Quadro or a consumer GeForce. The driver has the hardest part in consumer 3D Vision mode, as all drawing commands between 2 SwapBuffers must be stored, then duplicated, then each projection has to be tweaked toward the left and the right respectively. This step is the most problematic; nvidia often advises disabling in-game shadows to prevent problems... INSTEAD OF JUST PROVIDING THE CORRECT API TO GAME PROGRAMMERS, WHICH ALREADY EXISTS AS QUAD BUFFERED STEREO !!!!! grmbl, sorry.

Then both command batches are sent to the card, one drawing to FBO1, the other to FBO2, and the driver goes on with the next user-supplied frame. In parallel the card loops the display of FBO1 then FBO2 after each vsync, in sync with the shutter glasses' left and right eyes (using interrupts).

One big problem is the black-box delay added by LCD screens, which prevents some users from seeing 3D...

Alfonse Reinheart
01-05-2010, 03:12 AM
The driver has the hardest part in consumer 3D vision mode, as all drawing commands between 2 swapbuffers must be stored, then duplicated, then each projection has to be tweaked respectively to the left and to the right. This step is the most problematic, often nvidia advises to disable ingame shadows to prevent problems...

I always imagined that they took the framebuffer and depth buffer, and used those to reconstruct something not entirely unlike the actual scene from a slightly different perspective. Shift the depth values over a bit, using parallax to move some pixels more than others. That kind of thing.

I can't imagine how they could develop an algorithm that can, in 100% of cases, figure out what the "projection" in a vertex shader is, and then modify it for the other eye.


INSTEAD OF JUST PROVIDING THE CORRECT API TO GAME PROGRAMMERS, WHICH ALREADY EXISTS AS QUAD BUFFERED STEREO !!!!! grmbl, sorry.

Even if they did, it would mean nothing to already existing applications. You can't sell a product based on what a few games may provide in the future; that's what sunk physics processors (among other things).

That being said, it would be nice if they could provide this for new applications. So that backwards compatibility can be provided via this hack, while developers that actually want to support this can do it the right way.

ZbuffeR
01-05-2010, 03:30 AM
I can't imagine how they could develop an algorithm that can, in 100% of cases, figure out what the "projection" in a vertex shader is, and then modify it for the other eye.
Well, it works in far less than 100% of cases, and new games take some time before a driver patch can make them work well enough with 3D Vision. And it often comes down to disabling shadows, or switching off some multipass effect, to be playable.
Extracting left and right views from depth would just leave holes around a tree, for example.


Even if they did, it would mean nothing to already existing applications. You can't sell a product based on what a few games may provide in the future; that's what sunk physics processors (among other things).
Hell, they already have TWIMTBP which, among other things, certifies correct operation with NV 3D Vision. It is not much more complex to handle stereo from the ground up, as done in Quake 3. At least give us this possibility ...

There is a big difference with physics accelerators, because that was extra hardware to buy. 3D stereo is just a patch away, so the barrier to entry is much lower. And it does not touch the game mechanics in any way, contrary to physics.

dukey
01-05-2010, 11:49 AM
quad buffer is essentially dead since support for it was killed in vista


dead????
Do you mean that the nVidia Vista driver implementation of OpenGL does not support it? And is this the same for Windows 7?
But nVidia sells this 3D Vision package.


Yeah it's dead. Even with a quadro card you can't set a quad buffer pixel format in Vista/7. You can set one fullscreen, but you can only shutter with it, which is fairly useless for most people. The nvidia driver just generates stereo automatically for you in the driver. It may or may not work correctly. There is some guide of things you should not do if you want their driver to work with your prog.

Dark Photon
01-05-2010, 12:49 PM
Yeah it's dead. Even with a quadro card you can't set a quad buffer pixel format in Vista/7.
What if you disable that silly Aero thingy?

In Linux, there's a hot-key for that.

melomania
01-05-2010, 02:18 PM
Since the GTS250 was not going to work with quad buffering, I searched for an affordable Quadro card.
I managed to get an old FX4600 :)
I can now indeed render to a quad buffer!
The resulting view shows two pictures overlaying each other.

But the 3D Vision shutter glasses do not switch on.
I uninstalled the old 3D Vision software and installed the Quadro version. Doesn't work.
When checking the display settings I can enable 3D and select a 3D display type, like 'generic display with IR emitter' or '3D DLP' (with and without IR emitter). But there is no '3D Vision' option to select.

Can anybody give me a hint what the problem could be?

ZbuffeR
01-05-2010, 03:33 PM
Did you read this page ?
http://www.nvidia.com/object/quadro_pro_graphics_boards.html
How is the IR emitter connected to the sync system?


EDIT: and windowed stereo should work on Vista too, since recent drivers :
http://forums.nvidia.com/index.php?showtopic=95729&st=20&p=557838&#entry557838

melomania
01-05-2010, 04:44 PM
Did you read this page ?
http://www.nvidia.com/object/quadro_pro_graphics_boards.html
How is the IR emitter connected to the sync system?


EDIT: and windowed stereo should work on Vista too, since recent drivers :
http://forums.nvidia.com/index.php?showtopic=95729&st=20&p=557838&#entry557838

Yes I did read that.
I do not have this Din cable.
I was assuming it should work with the USB connection and you can improve synchronisation using the Din cable.
But maybe I'm wrong.
Here in Europe jou do not get the Din cable with it.
I could easily make one if I know the connection diagram.
But I can not find that information.

ZbuffeR
01-05-2010, 06:55 PM
Just read the whole topic above and found this: http://forums.nvidia.com/index.php?showtopic=98067
It covers both how to build the cable and how to ask Nvidia support to get you one (not clear if it is paid or not...)

melomania
01-06-2010, 09:41 AM
Thanks

I ordered the connectors. In a few days I will be ready to try it with the cable.

But shouldn't the glasses work with the USB emitter without the extra DIN cable? Or is the extra cable really needed for a Quadro?

I'm asking because I'm afraid something else is wrong.

ZbuffeR
01-06-2010, 10:15 AM
Well those are questions for nvidia...
But according to their forums, yes, it is compulsory for quad buffered stereo.

melomania
01-06-2010, 01:02 PM
Ok, I talked with nVidia support.
I had to uninstall and remove everything from nVidia and start installing everything one by one, rebooting several times in between.

Now it works. I do not know why, because I did it before. But probably I did something differently (booted one more time?).

I also have the 'Stereoscopic settings' tab with the test application again, which I had with the GTS250 but not with the FX4600 before.

And OpenGL quad buffering works :), even in a window (not fullscreen). This without the extra mini-DIN cable!

I'm happy now and can start experimenting with 3D stuff!!

Thank you all for the hints, suggestions and feedback.

matchStickMan
01-07-2010, 02:24 AM
Hi guys,

I'm trying to create a 3D app also. So I'll just do a recap of the things that you said to make it work:

1. Add the PFD_STEREO flag to the pixel format structure.

2. Draw one instance with
glDrawBuffer(GL_BACK_LEFT);
and another one with
glDrawBuffer(GL_BACK_RIGHT);

3. Don't use the GTS250 (I also tried on that one and my app crashed with GL_BACK_RIGHT); use a high-end Quadro card instead (I got a Quadro FX 3700).

I'm currently in the process of installing that card and I'll let you know what happens next.


I'm writing this to confirm with you if I'm going in the right direction.


One funny thing: when I tried the Nvidia 3D demo video (with the Nvidia player) on the GTS250, it worked fine. And the Nvidia website says that the GTS250 is 3D Vision ready.

ref: http://www.nvidia.com/object/3D_Vision_Requirements.html

So if that card does not support Quad buffering, how do they achieve the 3D effects? They also give a list of games that are compatible with those cards.


I'm asking this because I want to know if there's another way of creating stereoscopic vision without quad buffering.

Do let me know.
Tnx.

ZbuffeR
01-07-2010, 04:32 AM
matchStickMan, did you actually read this thread?
Nvidia 3D Vision does not provide an API to do the stereo yourself; it guesses it for the game, as opposed to quad-buffered stereo.

Dark Photon
01-07-2010, 06:17 AM
Nvidia 3D Vision does not provide an API to do the stereo yourself; it guesses it for the game, as opposed to quad-buffered stereo.
Right. More on that here:

* GDC09-3DVision-The_In_and_Out.pdf (http://developer.download.nvidia.com/presentations/2009/GDC/GDC09-3DVision-The_In_and_Out.pdf) (see pgs. 36-43)
* Nvidia StereoAPI problem (http://www.mtbs3d.com/phpBB/viewtopic.php?f=60&t=4026) (search for NvAPI)
* 3DVision_Develop_Design_Play_in_3D_Stereo.pdf (SIGGRAPH '09) (http://developer.download.nvidia.com/presentations/2009/SIGGRAPH/3DVision_Develop_Design_Play_in_3D_Stereo.pdf) (don't bother - even more vague than the GDC stuff)
* NVISION08: Easy immersion with NVIDIA 3D Stereo (http://developer.nvidia.com/object/nvision08-stereo.html)
* NvAPI (Home Page) (http://developer.nvidia.com/object/nvapi.html)

From the first, you can see there's this NvAPI Stereo layer on top of D3D. From the last, you can download NvAPI. Inside you'll find a header and a help file, NVAPI_Reference_Developer.chm, documenting the 3D stereo module APIs (use kchmviewer or chmsee on Linux).

Also from the presentations you do get the gist that this isn't as simple as quad-buffer stereo. It is essentially behind-the-scenes driver magic. Witness the shared cull frustum for both eyes, vertex shaders auto-modified by the NVidia driver, the driver using the w coordinate to render the left/right eye views. I've done stereo and multi-frustum displays before (the underlying concepts are simple), but frankly I came out of their SIGGRAPH presentation confused as to what they were doing or how I'd code for it if I ever wanted to.

melomania
01-07-2010, 02:35 PM
I'm currently in the process of installing that card and I'll let you know what happens next.


It seems to be very important to remove the old nVidia graphics card driver and 3D Vision driver as thoroughly as possible.
Thus: uninstall drivers, remove applications, reboot.
Install the new card driver. Reboot. Make sure it is installed correctly, then install 3D Vision. The nVidia Control Panel should now have a page where you can launch the test application. If that works, I think you are ok.

matchStickMan
01-08-2010, 12:53 AM
Hey tnx Dark Photon, those links were very helpful.

I've managed to setup the 3D stereoscopic view from the nVidia Control panel.

my app is still crashing though. I'll just keep debugging it.

BoeroBoy
08-04-2010, 01:04 PM
I too got a 3D setup for my NV GTX 260 in the hope of testing my OpenGL applications in stereo... Somewhere the marketers who decided to support only D3D stereo forgot to tell the advertisers and the vendors.

Had I seen "OpenGL not supported... at least not without GLDirect" I wouldn't have wasted the money. This is the second time nVidia has disappointed on a big ticket research item for me. First was when I got a quadro card years ago to use overlays only to hear that Vista would be eliminating all overlay support.

How can they release products with such potential and just cripple them with ludicrous driver limitations? Full-screen D3D only? My app targets the mainstream, for people without a $700 Quadro. Now they'll have to figure out anaglyph glasses with messed-up color. Thanks a lot, nVidia... I've liked them for their Linux support, but I think I'll take another look at ATI in the future.

FYI for some of you, GLDirect (an OpenGL -> Direct3D wrapper) may be a possible option.

http://sourceforge.net/projects/gldirect/

Robulus
08-18-2010, 07:57 AM
How does one set fullscreen mode? Does that have to be done in the CreateWindow() call? I seem unable to confirm stereo mode. :(



pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_DOUBLEBUFFER | PFD_SUPPORT_OPENGL | PFD_STEREO; // PFD_SUPPORT_GDI omitted: it is mutually exclusive with PFD_DOUBLEBUFFER

int iPixelFormat = ChoosePixelFormat(hdc, &pfd);

BOOL bSuccess = SetPixelFormat(hdc, iPixelFormat, &pfd);

iPixelFormat = GetPixelFormat(hdc);

DescribePixelFormat(hdc, iPixelFormat, sizeof(PIXELFORMATDESCRIPTOR), &pfd);

if ((pfd.dwFlags & PFD_STEREO) == 0)
MessageBoxError("NO STEREO!!!", "OpenGL");
else
MessageBoxInfo("STEREO YES!!!", "OpenGL");

def
09-20-2010, 02:29 PM
I just got the Nvidia 3D vision glasses and I did some experimenting. I couldn't get a GeForce card to work in stereo.

So I installed a Quadro FX 580 (no frame sync connector) and it works with the NVidia 3d Vision setup...

...without using quad buffering!!! Just
glDrawBuffer(GL_BACK);
Running an application at 120Hz, the frames get displayed alternating between the left and right eye, just as it should be possible without a Quadro card... Of course you have to check for correct sync and frame performance yourself.
(This works in both windowed and fullscreen mode.)

So this setup comes as close to "manually controlling" OpenGL stereo as possible.
Now we only need to find out how to hack the NVidia drivers...
Setting the window to stereo seems to be enough to activate the IR transmitter; the only problem is that on GeForce cards we get "pixelformat with necessary capabilities not found."

I used GLUT (lazy me) for my testing with:

glutInitDisplayMode( GLUT_DOUBLE | GLUT_RGBA | GLUT_STEREO );

Any comments or ideas?

ZbuffeR
09-20-2010, 02:42 PM
Well, it has already been confirmed that GLUT_STEREO with a Quadro and Nvidia 3d vision works correctly with GL_BACK_LEFT and GL_BACK_RIGHT.

Not sure what you want to do with just GL_BACK ?
I must be missing your point ...

def
09-21-2010, 11:38 AM
My Point?
Maybe only that there would be no effort involved for NVidia to enable this behavior on GeForce cards as well.
( just being able to switch that IR transmitter on via the driver settings would be nice. )

No quad buffering or creating a window with stereo pixelformat needed!

ZbuffeR
09-21-2010, 02:52 PM
You say that as if it was a good thing. It is not.

No quad buffer means your application can not control the stereo parameters. Bad.

def
09-22-2010, 12:58 AM
Not "bad" in all contexts!
When you are used to writing applications with fixed framerates and you need the fastest and newest cards, it's either spending too much money on Quadro cards or complaining about crippled GeForce OpenGL drivers...

ZbuffeR
09-22-2010, 05:40 AM
Please keep us updated about your progress.

I, too, would love to see correct stereo on a non-Quadro card; I am just pessimistic about it.

Chris Lux
09-23-2010, 12:27 AM
I've done stereo and multi-frustum displays before (the underlying concepts are simple), but frankly I came out of their SIGGRAPH presentation confused as to what they were doing or how I'd code for it if I ever wanted to.
Same here. It puzzles me more and more why they don't just open up QBS (quad-buffered stereo) for GeForces. This would make their lives easier, and ours too. There is just too much guessing involved with the driver-magic hacks, which frankly are doomed to fail from the start.


Please keep us updated about your progress.

I, too, would love to see correct stereo on a non-Quadro card; I am just pessimistic about it.
Ditto. :p