How do I draw into the back buffer using Win32 calls?

Hi,

I have seen this working with Direct3D and have been trying to find a way to do this with OpenGL.

Is this possible, and if so, how?

Thanks
Damian

It’s just as easy as in Direct3D, @damian.

Direct3D:

g_lpd3dDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                     D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
g_lpd3dDevice->BeginScene();
// draw stuff
g_lpd3dDevice->EndScene();
g_lpd3dDevice->Present(NULL, NULL, NULL, NULL);

For OpenGL:

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();

// draw vertices

glFlush(); // optional: SwapBuffers performs an implicit flush
SwapBuffers(g_hDC);

As long as your OpenGL device context and rendering context are created properly, you’re golden.
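
For reference, here is a minimal sketch of what “created properly” means for double buffering: a pixel format with PFD_DOUBLEBUFFER plus a WGL rendering context. Names such as InitGLContext are illustrative, and error handling is trimmed.

#include <windows.h>
#include <GL/gl.h>

HDC   g_hDC;
HGLRC g_hRC;

BOOL InitGLContext(HWND hWnd)
{
    // Request a double-buffered RGBA pixel format with a depth buffer.
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    g_hDC = GetDC(hWnd);
    int format = ChoosePixelFormat(g_hDC, &pfd);
    if (format == 0 || !SetPixelFormat(g_hDC, format, &pfd))
        return FALSE;

    // Create and activate the WGL rendering context.
    g_hRC = wglCreateContext(g_hDC);
    return g_hRC && wglMakeCurrent(g_hDC, g_hRC);
}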

hth,

Err… I’m guessing you want to know how to draw into the back buffer using Win32 GDI calls, right? Sorry, I don’t know whether that is possible; actually, I don’t even think Direct3D allows you to do that. I’m sure you can lock a DirectDraw surface for GDI calls, but that’s not the same thing, of course…

:confused:

Thanks for the replies.

Yes, I want to draw into the back buffer using Win32 GDI calls.

I have come to the conclusion after a lot of reading and experiments that this is not possible.

I’ve also tried using pbuffers and framebuffer objects in an attempt to do the double buffering myself. However, the performance hit is too high.

It is, however, possible in DirectX, so sadly I’m going to have to go down that route on Windows.

Regards
Damian

Having come back to this topic after some time doing less interesting things, I have found that it is possible to emulate the effect of GDI and Xlib drawing with OpenGL.

The effect on X11 is exactly the same as with DirectX on Windows, but the effect on Win32 is slightly different: you have to use a 32-bit pixmap/bitmap, and GDI is not capable of handling alpha correctly with patterned fills or raster operations.

Basically, the approach is to draw into a pixmap/bitmap and then either load it as a texture (texture-from-pixmap) or blend it using buffers. It is a bit complex to describe in a post, but it all works and I get the original effect I wanted.
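
Roughly, the Win32 half looks like this sketch: a 32-bit top-down DIB section that GDI can draw into and whose pixels can later be uploaded as a texture. Names such as CreateOverlayBitmap are illustrative, not the actual code.

#include <windows.h>

static void*   g_overlayBits = NULL;  // raw BGRA pixels, shared with GDI
static HBITMAP g_overlayBmp  = NULL;
static HDC     g_overlayDC   = NULL;

BOOL CreateOverlayBitmap(int width, int height)
{
    BITMAPINFO bmi = {0};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;  // negative height = top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;       // BGRA, 8 bits per channel
    bmi.bmiHeader.biCompression = BI_RGB;

    g_overlayDC  = CreateCompatibleDC(NULL);
    g_overlayBmp = CreateDIBSection(g_overlayDC, &bmi, DIB_RGB_COLORS,
                                    &g_overlayBits, NULL, 0);
    if (g_overlayBmp == NULL)
        return FALSE;
    SelectObject(g_overlayDC, g_overlayBmp);
    // GDI calls can now draw into g_overlayDC; the pixels land in
    // g_overlayBits, ready to be uploaded as an OpenGL texture.
    return TRUE;
}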

Regards
Damian

You can use GDI functions to draw into the OpenGL back buffer, but it will be neither accelerated nor supported on Vista. I wouldn’t do it.
http://www.opengl.org/pipeline/article/vol003_7/

Sorry, but you have missed the point.

If you know what you are doing, you can get accelerated OpenGL drawing combined with GDI/X11 drawing, and you can do this on Windows/Vista and on X11.

Though the X11 drawing works better.

What you cannot do is interleave the calls. You have to do the OpenGL drawing, then the GDI drawing, then swap the buffers.

The GDI drawing is done to a bitmap/pixmap. The bitmap needs its alpha set up correctly (this is where X11 works better, as GDI does not handle alpha very well for patterned fills).

You load the texture (use glTexSubImage2D, or if you have ARB_pixel_buffer_object, use it along with glTexSubImage2D) and basically draw over the top of the previously rendered OpenGL scene with a large quad carrying this texture, with blending enabled.
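
If it helps anyone, here is a rough sketch of that step (my own naming, fixed-function GL, no PBO; the BGRA layout comes from a 32-bit top-down DIB section; treat it as an illustration, not my exact code):

#include <windows.h>
#include <GL/gl.h>

// tex is assumed to have been created beforehand with glTexImage2D and
// GL_LINEAR or GL_NEAREST filtering; pixels points at the GDI-drawn
// BGRA bits of the DIB section.
void DrawOverlay(GLuint tex, const void* pixels, int width, int height)
{
    // Upload the GDI-drawn pixels into the existing texture.
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);

    // Switch to a 0..1 ortho projection for a screen-aligned quad.
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(0, 1, 0, 1, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    glEnable(GL_TEXTURE_2D);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDisable(GL_DEPTH_TEST);   // assumes the 3D scene used depth testing

    // t coordinates are flipped to compensate for the top-down DIB rows.
    glBegin(GL_QUADS);
    glTexCoord2f(0, 1); glVertex2f(0, 0);
    glTexCoord2f(1, 1); glVertex2f(1, 0);
    glTexCoord2f(1, 0); glVertex2f(1, 1);
    glTexCoord2f(0, 0); glVertex2f(0, 1);
    glEnd();

    glEnable(GL_DEPTH_TEST);
    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_2D);

    // Restore the matrices saved above.
    glPopMatrix();
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}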

Depending on how fast you want things to run, you could draw into the bitmap in a secondary thread. If you want to push this even further, then upload the texture in that thread, or have two textures and stagger their use to ensure the CPU-to-GPU upload does not occur when you need the texture. I prefer to upload the textures in the main thread, as I have found that this is more reliable across more OpenGL drivers; I just load the textures well before I need them.
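
For what it’s worth, the two-texture staggering might look something like this sketch (again my own naming; DrawOverlayQuad is a hypothetical draw-only helper along the lines of the quad code above):

#include <GL/gl.h>

void DrawOverlayQuad(GLuint tex);   // hypothetical: draws the blended quad only

static GLuint g_tex[2];             // two overlay textures used in alternation
static int    g_frame = 0;

void FrameWithStaggeredOverlay(const void* pixels, int width, int height)
{
    GLuint uploadTex = g_tex[g_frame & 1];        // written this frame
    GLuint drawTex   = g_tex[(g_frame + 1) & 1];  // uploaded last frame

    // Upload the new overlay pixels into one texture while drawing with
    // the other, so the upload has a full frame to finish before the
    // GPU needs to sample it.
    glBindTexture(GL_TEXTURE_2D, uploadTex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_BGRA_EXT, GL_UNSIGNED_BYTE, pixels);

    DrawOverlayQuad(drawTex);

    ++g_frame;
}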

The impact on frame rate with this approach is at least a halving. The two main costs are the GDI/X11 drawing and the upload of the texture.

On my target system I am getting more than 100Hz.
