How to use double buffering

Hello,
My app uses double buffering for drawing to avoid flicker.
On my computer it runs perfectly, but if I move it to another computer, all I can see is flickering.
What flag or property do I have to set so that double buffering is always enabled, whatever graphics card is used?

Please help me!!!

It depends on what API you’re using to set up your OpenGL window. With the Win32 API, you have to set the PFD_DOUBLEBUFFER flag in your PIXELFORMATDESCRIPTOR structure.
Have a look at NeHe’s tutorials.
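For what it’s worth, here is a minimal sketch of that Win32 setup (a window handle hWnd is assumed to already exist, and error checking is left out):

#include <windows.h>
#include <GL/gl.h>

// Request a double-buffered RGBA pixel format for an existing window
HGLRC SetupGL(HWND hWnd)
{
    PIXELFORMATDESCRIPTOR pfd = { 0 };
    pfd.nSize = sizeof(PIXELFORMATDESCRIPTOR);
    pfd.nVersion = 1;
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 16;
    pfd.iLayerType = PFD_MAIN_PLANE;

    HDC hDC = GetDC(hWnd);
    int format = ChoosePixelFormat(hDC, &pfd); // closest format the driver offers
    SetPixelFormat(hDC, format, &pfd);         // bind it to the device context
    HGLRC hRC = wglCreateContext(hDC);         // create the GL rendering context
    wglMakeCurrent(hDC, hRC);
    return hRC;
}

At the end of every frame you then call SwapBuffers(hDC) to present the finished back buffer.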

Sorry, but that is exactly what I already do.

My trouble is: why does it work properly on my computer but not on others?

I suppose there is some flag or property I have to set.

Thanks.

I don’t know very much about OpenGL, but couldn’t it be that he is clearing the front buffer, and on a fast computer the front buffer gets redrawn so quickly that you never see the clear, while on a slow computer you do?

If he is in single-buffer mode and vsync is disabled, then the screen will flicker on redraw if the computer is fast. But if you write your code to select a double-buffered mode as Moz suggests, it should work on all computers with decent graphics hardware. There is no other flag or property. If you are using GLUT, there is a simple setting in there too.
I think it is glutInitDisplayMode(GLUT_DOUBLE | all your other requests); see the sketch below.
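For GLUT, a minimal sketch would look something like this (the window size and title are just placeholders):

#include <GL/glut.h>

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    /* ... draw the scene into the back buffer here ... */
    glutSwapBuffers(); /* present the finished frame, no flicker */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_DOUBLE requests a double-buffered visual */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(640, 480);
    glutCreateWindow("double buffered window");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}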

Are you using GLUT or not? What graphics hardware do you have on both computers? Are your graphics drivers up to date?

Hello,

My PIXELFORMATDESCRIPTOR structure is:

PIXELFORMATDESCRIPTOR uPfd = {
sizeof(PIXELFORMATDESCRIPTOR), // Size of this structure
1, // Version of this structure
PFD_DRAW_TO_WINDOW | // Draw to window (not to a bitmap)
PFD_SUPPORT_OPENGL | // Support OpenGL calls in window
PFD_DOUBLEBUFFER, // Double buffered
PFD_TYPE_RGBA, // RGBA Color mode
24, // Want 24bit color
0,0,0,0,0,0, // Not used to select mode
0,0, // Not used to select mode
0,0,0,0,0, // Not used to select mode
16, // Size of depth buffer
0, // Not used to select mode
0, // Not used to select mode
PFD_MAIN_PLANE, // Draw in main plane
0, // Not used to select mode
0,0,0 }; // Not used to select mode

My computer is a Pentium III 1 GHz and I’ve changed my graphics card a few times; I’ve used an NVIDIA Quadro2 MXR, a Matrox Millennium G200 and the Intel 815 chipset. With all of these cards the application works correctly.
But if I run my program on another computer that is just like mine (a PIII 1 GHz with a Matrox Millennium G200), it doesn’t work.
I’m going crazy.

Try this

//search for the pixel format
int pfindex = ChoosePixelFormat( hDC, &uPfd );

PIXELFORMATDESCRIPTOR pfd;
//get the details of this pixel format
DescribePixelFormat( hDC, pfindex, sizeof( PIXELFORMATDESCRIPTOR ), &pfd );

pfd will now contain the pixel format you are really getting. Check whether it supports double buffering (see the check sketched below); I am thinking that you are getting a pixel format that doesn’t support double buffering.
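For example, right after the DescribePixelFormat call above, something like this will tell you whether double buffering is really there (just a sketch; the message text is arbitrary):

if (!(pfd.dwFlags & PFD_DOUBLEBUFFER))
{
    // The format the driver actually gave us is single buffered,
    // which would explain the flicker on that machine.
    MessageBox(NULL, "Selected pixel format is not double buffered", "OpenGL", MB_OK);
}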

Oh yes, that’s the answer.
Now it works properly.
I’ve changed the pixel format to one that supports double buffering.

Thank you very much indeed.

Best Regards.