3D Vision and GL 3.x +

I’m working on a GL 3.x based engine, and, with all the 3D stuff going around, I wanted to make sure it will be compatible.

From searching these forums and reading Nvidia developer papers I’m confused. Maybe I missed something.

The general problem with the 3D Vision approach seems to be that the Nvidia driver is simply patching vertex shaders on the fly, adjusting w coordinates and projection matrices in them?

But GL 3.x doesn’t have such a thing as glVertex, because everything is transferred as generic vertex attributes. And matrices are also fully custom built, without API functions the driver could intercept.

Does it mean we won’t be able to use stereo with 3D Vision?

Should I even bother with square/nonsquare rendertargets and stuff like that for “free” 3D, or do I need to prepare to do 3D the old-fashioned way and hope for some API calls to sync buffer swaps with the glasses?

>> The general problem with the 3D Vision approach seems to be that the Nvidia driver is simply patching vertex shaders on the fly, adjusting w coordinates and projection matrices in them?

Haven’t delved very deeply myself, but I don’t think that’s going to get you there. I think you need to shear the projection to the left and right such that the resulting frusta converge and coincide at the focal plane.

Brolingstanz, you are wrong; the problem with 3D Vision is that IT CANNOT BE CONTROLLED inside the application, contrary to quad-buffered stereo or good old anaglyph.

And yes it is a pain.

In a thought experiment, I imagined a gerbil treadmill powering two electrodes attached to my left and right eyes, stimulating an alternating blink response at regular intervals over the course of several hours, all the while being subjected to a battery of visuals composed entirely of opponent colors ^1.

^1 Opponent colors are those that normally can’t be seen together at once, specifically reddish green and yellowish blue.

If you want to support stereo rendering in your engine, then the most appropriate way in OpenGL is to use a quad-buffer stereo pixel format. This will give you all the control you need, and will work for other vendors as well through a standard interface.
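For context, with a quad-buffered format the application renders each frame twice, once into GL_BACK_LEFT and once into GL_BACK_RIGHT, and a single buffer swap presents both eyes in sync. A minimal sketch of that pattern (the stereoPass helper and the eye-separation value are hypothetical; only the two GL enum values are real, copied from gl.h):

```cpp
#include <cassert>

// Buffer-selection enums as defined in GL/gl.h:
constexpr unsigned kBackLeft  = 0x0402; // GL_BACK_LEFT
constexpr unsigned kBackRight = 0x0403; // GL_BACK_RIGHT

struct EyePass {
    unsigned drawBuffer; // value to pass to glDrawBuffer()
    double   eyeOffset;  // signed camera shift along the view x axis
};

// eye: 0 = left, 1 = right. eyeSeparation is the full interocular
// distance; each eye is shifted half of it in opposite directions.
EyePass stereoPass(int eye, double eyeSeparation)
{
    return { eye == 0 ? kBackLeft : kBackRight,
             (eye == 0 ? -0.5 : 0.5) * eyeSeparation };
}

// With a live context, the frame loop would look like:
//   for (int eye = 0; eye < 2; ++eye) {
//       EyePass pass = stereoPass(eye, eyeSeparation);
//       glDrawBuffer(pass.drawBuffer);
//       glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
//       // ...shift the view by pass.eyeOffset, set an off-axis
//       // projection, draw the scene...
//   }
//   SwapBuffers(hdc); // one swap flips both eyes, keeping them in sync
```

The point is that the application, not the driver, decides what each eye sees, which is exactly the control 3D Vision withholds.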

Many applications use QBS today, although very few games do.

Thank you, Pierre, but can you name which consumer cards actually provide a quad-buffer stereo pixel format?

Just to give my two cents:

I sincerely hope that an extremely popular OpenGL game comes out (Rage?) so that Nvidia is either forced to enable OpenGL for 3D Vision in a way that is useful for other games and applications, or (the much preferred solution) Nvidia and ATi open up the quad-buffer API to GeForce and Radeon models.

If you think about it, there is a way to let developers make the right decisions for correct stereoscopic rendering, but Nvidia makes all these assumptions about render target sizes and so on, which almost never work correctly.

So I think, with stereoscopic rendering penetrating the consumer market more and more, the quad-buffer API has to be opened!

Well, not really, as very few games use OpenGL. If quad buffering is ever added to D3D, then we might see quad-buffered GL pixel formats available on GeForces. Shortly followed by the end of the Quadro line of cards, I would imagine, as it’s the only thing left they can do that the consumer cards can’t (apart from the hacky gsync stuff, which has a very limited market).
I agree it’s an awful waste. You need properly skewed frustums to do stereo properly; shearing is just wrong. I just don’t see the point in buying a 3D Vision setup at the moment.
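For what it’s worth, the off-axis projection being described is small enough to write down. A sketch (parameter names and values are mine, not from any SDK): the side planes of each eye’s frustum are shifted so that both frusta share exactly the same cross-section at the convergence (zero-parallax) plane:

```cpp
#include <cassert>
#include <cmath>

// glFrustum-style bounds for one eye of an off-axis stereo pair.
// fovY is the vertical field of view in radians; eyeOffset is the
// signed half interocular distance (negative for the left eye);
// convergence is the distance to the zero-parallax plane.
struct Frustum { double left, right, bottom, top; };

Frustum offAxisFrustum(double fovY, double aspect, double zNear,
                       double convergence, double eyeOffset)
{
    double top   = zNear * std::tan(fovY * 0.5);
    double halfW = top * aspect;
    // Shift the side planes opposite to the eye translation, scaled
    // to the near plane, so the frusta meet at the convergence plane.
    double shift = eyeOffset * zNear / convergence;
    return { -halfW - shift, halfW - shift, -top, top };
}
// The view matrix for the same eye is additionally translated by
// -eyeOffset along the camera's x axis.
```

At the convergence distance both frusta cover the identical world-space rectangle, which is what puts zero parallax there; nearer objects pop out of the screen and farther objects recede behind it.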

I think there’s also some 30bpp X screens and render target support (10/10/10/2), SDI outputs (HDR GPU video outputs), and no doubt some other things I don’t know about.

<not speaking for amd>
Given the recent changes in the industry (many new monitors supporting stereo natively, HDMI 1.4), QBS might not remain a workstation feature for much longer, especially considering that it is indirectly available on D3D.
</not speaking for amd>

If a company wants to develop a QBS game, I would recommend that it contact the corresponding IHV to ask for support.

Not really “in development”, but for reference, there is at least one game I verified as having Quad Buffer Stereo support:
Doom 3

And I am pretty sure Quake 3 Arena too, but that brings us far into the past.

Doom 3 was the last Id game to use OpenGL.
That was a long time ago.

I really think it was a mistake for nvidia to release 3D Vision without exposing quad buffers to the programmer. Now yet another generation of gamers has had their fingers burnt by a crap 3D implementation. They’ll struggle to sell a new solution in the near future.

I have to admit that chunking up my friends in “3.5D” would be very satisfying…

Still, I don’t need QBS to give biker and associates a taste of my railgun o’ unholy whoopass - that never gets old.

Doom 3 with the latest patches seems not to support quad-buffer stereo, and even Quake 3’s implementation seems wrong and does not give a good depth impression.

I hope one IHV will step up and release consumer quad-buffer enabled drivers; this would enable correct off-axis projections for perfect stereoscopic rendering. Think about the possibilities of using consumer head tracking like TrackIR in combination with stereo… exciting :wink:
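On the head-tracking point: a tracked eye position slots straight into the same off-axis machinery. A sketch, under my own conventions (not TrackIR’s actual API): the screen lies in the z = 0 plane and the tracker reports the eye position in the same units:

```cpp
#include <cassert>
#include <cmath>

// Off-axis frustum for an arbitrary (tracked) eye position in front
// of a fixed physical screen spanning [x0,x1] x [y0,y1] in the z = 0
// plane. The eye sits at (ex, ey, ez) with ez > 0. Returns
// glFrustum-style bounds; the view matrix must additionally
// translate by (-ex, -ey, -ez).
struct ScreenFrustum { double left, right, bottom, top; };

ScreenFrustum trackedFrustum(double x0, double x1, double y0, double y1,
                             double ex, double ey, double ez,
                             double zNear)
{
    // Project the screen edges through the eye onto the near plane.
    double s = zNear / ez;
    return { (x0 - ex) * s, (x1 - ex) * s,
             (y0 - ey) * s, (y1 - ey) * s };
}
```

Feeding the two eye positions of a stereo pair through this gives the converged frusta with the physical screen itself as the zero-parallax plane; moving the head off-center simply makes the frustum more asymmetric.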

Quake 3: try upping cg_stereoSeparation to 4.0 instead of the default 0.4. I had great results in the past; getting plasma/lightning in the face was really freaky.
The HUD was not great in 3D, however; better to switch it off:
seta r_stereo “1”
seta cg_stereoSeparation “4.0”
seta r_drawstatus “0”
seta cg_drawAmmoWarning “0”
seta cg_drawAttacker “0”
seta cg_drawFPS “0”
seta cg_drawcrosshairnames “0”
seta cg_drawkiller “0”
And do not forget to lower the FOV, to around 70.

Or use ioquake3, which gives more control over the stereo projections; it supports both QBS and anaglyph for us poor consumers:
http://wiki.ioquake3.org/Stereo_Rendering

About head tracking: yes, it would be a great addition, but what I saw of TrackIR-like setups was more like a new controller rather than actual frustum positioning.

While I think that not releasing all the necessary tools for development was a very bad move, making 3D work in a hacked way in pretty much all DirectX games is actually better than what PhysX did: real hardware, no games.

And $400 for a 3D Vision setup (including a 22" screen) is not that bad, actually.

But given that a lot of OpenGL applications use custom matrix handling and parameter passing, that’s kind of a no-go for us, it seems.

So for now, all I can do is accommodate the quad-buffer approach and hope for the best? [Do Nvidia/ATI people still attend these forums?]

I understand I need a Quadro to do the testing?

[quote="M/\dm/"]
While I think that not releasing all the necessary tools for development was very bad move, making 3D work in a hacked way in pretty much all DirectX games is actually better than what PhysX did, real hardware, no games.
[/quote]

I agree; you can count on one hand the games that work without problems. Most of them have serious issues, like shadows in screen space or post-processing effects based on a single eye, etc.

[quote]
And $400 for a 3D Vision setup (including a 22" screen) is not that bad, actually.

But given that a lot of OpenGL applications use custom matrix handling and parameter passing, that’s kind of a no-go for us, it seems.

So for now, all I can do is accommodate the quad-buffer approach and hope for the best? [Do Nvidia/ATI people still attend these forums?]

I understand I need a Quadro to do the testing?
[/quote]

Yes, you need a Quadro. I got to do some testing here at work using an FX5800 on my 22" 3D Vision enabled display. It works great. But, being from Europe, I had to solder up a 3D sync cable to connect the emitter to the Quadro, because Nvidia decided that the European market does not need the cable and could save some money on it, while selling the 3D Vision set for the same figure in Euros as they charge in Dollars :p.

Criticism aside: I really think modern renderers cannot be hacked from the outside to do correct off-axis projection stereo. The developer needs the tools to make the right decisions (what to render, when, and to where). There is so much literature out there about how to do it right that they (ATi, Nvidia, Intel, etc.) don’t even need to produce anything themselves besides enabling quad-buffer stereo on consumer-level products.

This is definitely one thing OpenGL currently has over D3D: it has an API for it.

Maybe we should take this to the future OpenGL proposal threads, to make something like this required core, but we all know how this will end ;).

>> Maybe we should take this to the future OpenGL proposal threads, to make something like this required core

To make what “required core”? That quad-buffer stereo is available on all implementations of a particular version? Even if no goggles or whatever are actually installed?

No thank you. I’d much rather use the existing interface to ask if quad-buffer stereo is available (that is, the presence of QBS pixel formats indicates goggles), and use it where possible (and where the user decides).

I confirm that you need a Quadro; we bought an FX3700 just for testing 3D Vision support with OpenGL. Quite an expensive test…

Anyway, the good news is that the 3D sync cable is not mandatory; our setup works fine without it, just leaving the IR emitter connected via USB.

The bad news is that the 3D Vision drivers for Quadro are pretty lame and not so robust. We had a hell of a time trying to have a proper stable setup on Win7 x64 without success, while we had no problem on XP32…

Add my vote for proper support for QBS on consumer-level cards; the DirectX hack is not the way to go to create anything of value in stereoscopic rendering/visualization.

Cheers

I do not see your problem. How is this any different from anti-aliasing or anisotropic filtering? If the hardware supports it and the user chooses to enable it in the video settings of his game/application, it is usable; if he chooses not to, fine. That’s how it works today with Quadros and QBS, and that’s exactly what I am asking for in consumer-level drivers.