Quadro & stereo pixelformat

I have a problem setting up a stereo pixelformat on a Quadro Pro board.

When I use GLUT and ask for a stereo format, this works OK. When I draw something into only one buffer, I can clearly see the screen flickering.
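
For reference, the GLUT path looks roughly like this (a minimal sketch, not my exact program):

[code]
/* Minimal sketch (not my exact program): ask GLUT for a quad-buffered
   stereo visual and draw into one back buffer only. */
#include <GL/glut.h>

static void display(void)
{
    glDrawBuffer(GL_BACK_LEFT);              /* left eye only */
    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_STEREO);
    glutCreateWindow("stereo test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
[/code]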

But when I set up my own window with Win32 calls and ask for a stereo pixel format, I can’t seem to find one that works properly. When I check the pfd returned by ChoosePixelFormat, it is a stereo format. However, when I draw something, it doesn’t seem to be quadbuffered (no flicker) and, even worse, I get no hardware acceleration.
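
The Win32 path is essentially this kind of thing (a simplified sketch of what I mean, not my exact code):

[code]
/* Simplified sketch of the Win32 setup (hdc is the window's device context). */
#include <windows.h>
#include <GL/gl.h>

HGLRC setup_stereo_context(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL |
                     PFD_DOUBLEBUFFER | PFD_STEREO;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    int pf = ChoosePixelFormat(hdc, &pfd);
    if (pf == 0)
        return NULL;
    DescribePixelFormat(hdc, pf, sizeof(pfd), &pfd);   /* see what we really got */
    if (!SetPixelFormat(hdc, pf, &pfd))
        return NULL;

    HGLRC hrc = wglCreateContext(hdc);
    if (hrc)
        wglMakeCurrent(hdc, hrc);
    return hrc;
}
[/code]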

I downloaded NV Pixel Format 1.0 to check the available pixel formats. The pfd identifiers it reports are the ones I used to try to set up my window. If I try those pixel formats inside NV Pixel Format, I get quadbuffering and HW acceleration, but when I use the same identifiers to set up my own window, I get neither quadbuffering nor HW acceleration.
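
When I say I use their identifiers, I mean something along these lines (sketch; the pf index is whatever the tool reports):

[code]
/* Sketch of forcing one of the formats the tool reports (pf is just an
   example parameter, e.g. one of the stereo formats it lists). */
#include <windows.h>
#include <GL/gl.h>

HGLRC use_pixel_format_index(HDC hdc, int pf)
{
    PIXELFORMATDESCRIPTOR pfd;
    if (!DescribePixelFormat(hdc, pf, sizeof(pfd), &pfd))
        return NULL;
    /* sanity check: it should really be an accelerated stereo format */
    if (!(pfd.dwFlags & PFD_STEREO) || (pfd.dwFlags & PFD_GENERIC_FORMAT))
        return NULL;
    if (!SetPixelFormat(hdc, pf, &pfd))
        return NULL;
    HGLRC hrc = wglCreateContext(hdc);
    if (hrc)
        wglMakeCurrent(hdc, hrc);
    return hrc;
}
[/code]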

There must be something I’m missing.
Any suggestions?

>>When I check the pfd returned by ChoosePixelFormat, it is a stereo format. However, when I draw something, it doesn’t seem to be quadbuffered (no flicker) and, even worse, I get no hardware acceleration.<<

That sounds mutually exclusive.
If you check glGetBooleanv(GL_STEREO, &isStereo); and it says true, you don’t have a Microsoft (generic software) pixelformat; the NVIDIA OpenGL driver is running, which implies HW support. (You may also check the dwFlags returned by DescribePixelFormat() and glGetString(GL_VERSION).)
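
Something like this, run with the context current (sketch; hdc is assumed to be your window’s DC):

[code]
/* Sketch of the checks mentioned above, with the GL context current on hdc. */
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

void report_gl_info(HDC hdc)
{
    GLboolean isStereo = GL_FALSE;
    glGetBooleanv(GL_STEREO, &isStereo);
    printf("GL_STEREO:   %s\n", isStereo ? "yes" : "no");
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, GetPixelFormat(hdc), sizeof(pfd), &pfd);
    printf("PFD_STEREO:         %s\n", (pfd.dwFlags & PFD_STEREO) ? "yes" : "no");
    printf("PFD_GENERIC_FORMAT: %s\n",
           (pfd.dwFlags & PFD_GENERIC_FORMAT) ? "yes (Microsoft software)" : "no (ICD)");
}
[/code]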

If you have a program which works in quadbuffered stereo, you do something wrong while selecting the pixelformat or while drawing.
Are you sure the GLUT example doesn’t flicker simply because it’s double buffered and uses PFD_SWAP_EXCHANGE (i.e. flips buffers during SwapBuffers())?

As a test, use a single buffered stereo pixelformat first.
Don’t draw into GL_FRONT; that draws into both buffers and you won’t see flickering.
Draw something different into GL_FRONT_LEFT and GL_FRONT_RIGHT and it should be obvious whether stereo switching works.
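
For example (a sketch; with the glasses running, each eye should see a different colour):

[code]
/* Sketch, assuming a single buffered stereo pixelformat is current. */
glDrawBuffer(GL_FRONT_LEFT);
glClearColor(1.0f, 0.0f, 0.0f, 1.0f);   /* left eye: red   */
glClear(GL_COLOR_BUFFER_BIT);

glDrawBuffer(GL_FRONT_RIGHT);
glClearColor(0.0f, 0.0f, 1.0f, 1.0f);   /* right eye: blue */
glClear(GL_COLOR_BUFFER_BIT);

glFinish();
[/code]
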
Also make sure you have selected the correct stereo mode in the advanced control panel, which you can access once you have enabled the quadbuffered stereo option in the OpenGL tab of the display control panel.

What is your stereo HW (shutter glasses, for example) doing? Does it flicker when you open a stereo window?

Originally posted by Relic:
[b]>>When I check the pfd returned by ChoosePixelFormat, it is a stereo format. However, when I draw something, it doesn’t seem to be quadbuffered (no flicker) and, even worse, I get no hardware acceleration.<<

That sounds mutually exclusive.
If you check glGetBooleanv(GL_STEREO, &isStereo); and it says true, you don’t have a Microsoft (generic software) pixelformat; the NVIDIA OpenGL driver is running, which implies HW support. (You may also check the dwFlags returned by DescribePixelFormat() and glGetString(GL_VERSION).)[/b]

I haven’t thought of it this way, but that sounds pretty right.
However, when I look for stereo formats, the formats 13 to 16 are reported as stereo, and that is exactly what NV Pixel Format gives me too (you know, that’s the NVIDIA tool that checks all your available formats). So I expected to get the flicker effect when drawing with one of those PF identifiers, but I don’t, and it’s really slow (as if there were no HW acceleration).
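
The kind of enumeration I mean is roughly this (sketch, same idea as the NV tool):

[code]
/* Walk every pixel format on the DC and flag the stereo ones. */
#include <windows.h>
#include <stdio.h>

void list_stereo_formats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd); /* returns max index */
    for (int pf = 1; pf <= count; ++pf) {
        DescribePixelFormat(hdc, pf, sizeof(pfd), &pfd);
        if (pfd.dwFlags & PFD_STEREO)
            printf("format %d: stereo, %s\n", pf,
                   (pfd.dwFlags & PFD_GENERIC_FORMAT) ? "generic (software)"
                                                      : "ICD (hardware)");
    }
}
[/code]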


If you have a program which works in quadbuffered stereo, you do something wrong while selecting the pixelformat or while drawing.
Are you sure the GLUT example doesn’t flicker simply because it’s double buffered and uses PFD_SWAP_EXCHANGE (i.e. flips buffers during SwapBuffers())?

I’m sure about that because I programmed this GLUT program specifically to test stereo (it’s a spinning-cube kind of thing, a slightly more evolved type of game than the Quake-likes) and stereo works with GLUT (tested with glasses, the effect works fine).


As a test, use a single buffered stereo pixelformat first.
Don’t draw into GL_FRONT; that draws into both buffers and you won’t see flickering.
Draw something different into GL_FRONT_LEFT and GL_FRONT_RIGHT and it should be obvious whether stereo switching works.

I shall try that


Also make sure you have selected the correct stereo mode in the advanced control panel, which you can access once you have enabled the quadbuffered stereo option in the OpenGL tab of the display control panel.

I did check the “Activate quadbuffering” checkbox (well, mine says “Activer l’API stéréo à quatre tampons”, which should be something like “Enable the quad-buffered stereo API” in English). Is there something else to do?


What is your stereo HW (shutter glasses, for example) doing? Does it flicker when you open a stereo window?

Homebrew hardware (yea, we’re not rich but we’ve got ideas). My stereo glasses are 3DMAX glasses. My homemade controller is triggered by the VSync signal, so it always flickers … even if there is no stereo window.
Is that bad? No. It works perfectly with my GLUT example and with the test window in NV Pixel Format.

Thanks for your help, I’m gonna try what you suggested.

Stupid me: in my program I have a text console that displays information about the renderer, and it is indeed the Quadro that’s doing the rendering … but it is slower than software rendering with a standard (non-stereo) pixelformat.
So I guess I am doing something wrong, but what could that be? I have an accelerated format that’s supposed to be stereo, I never change the render target (so I guess it is the back-left buffer), and I get no flicker but a ridiculous framerate.

Any idea?

The default glDrawBuffer for a double buffered stereo pixelformat is GL_BACK, which means LEFT and RIGHT simultaneously.
Try whether setting GL_BACK_LEFT explicitly helps.
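
I.e. something like this per frame (sketch):

[code]
/* Sketch: select one back buffer per eye instead of the default GL_BACK. */
glDrawBuffer(GL_BACK_LEFT);
/* ... render the left-eye view ... */
glDrawBuffer(GL_BACK_RIGHT);
/* ... render the right-eye view ... */
SwapBuffers(hdc);   /* hdc = your window's device context */
[/code]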

Otherwise, there are some examples using Win32 directly: http://www.stereographics.com/html/developers.html

Maybe you’ll find one which works and can modify its source according to your needs.

The default glDrawBuffer for a double buffered stereo pixelformat is GL_BACK, which means LEFT and RIGHT simultaneously.
Try whether setting GL_BACK_LEFT explicitly helps.

You’re right, that was it.
I was so bothered by the fact that it did not flicker that I didn’t even think that would solve my problem.
Thanks again.