WGL - Anybody Bother Passing PIXELFORMATDESCRIPTOR To SetPixelFormat()?

Hi Folks:

Thanks to posters in this forum, I’ve been happily rendering my first model in an application’s splash screen for several weeks.

I lifted the code to set up WGL from this tutorial.

Tonight I had occasion to look at the OpenGL setup code and noticed that I had initialized the PIXELFORMATDESCRIPTOR value for the pre-OpenGL 3 path, but not for the path used for newer OpenGL.

The code ran fine that way, and it runs fine now that I’m passing NULL for the argument.

SetPixelFormat()'s documentation here says this about the PIXELFORMATDESCRIPTOR argument:

Pointer to a PIXELFORMATDESCRIPTOR structure that contains the logical pixel format specification. The system’s metafile component uses this structure to record the logical pixel format specification. The structure has no other effect upon the behavior of the SetPixelFormat function.

I don’t know enough about Windows or OpenGL to know the meaning of that statement. It does sound dismissive.

Does anybody do anything with this SetPixelFormat() argument for current OpenGL?

  Thanks
  Larry

Both FreeGLUT and GLFW pass the information to SetPixelFormat(). FreeGLUT just passes the same pointer it passed to ChoosePixelFormat(), while GLFW uses DescribePixelFormat() to populate the structure.
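For concreteness, a minimal sketch of that pattern (my own illustration, not GLFW’s or FreeGLUT’s actual source; the DC and the descriptor you originally built for ChoosePixelFormat() are assumed to come from the caller):

[code]
#include <windows.h>

/* Sketch: choose a format index, then let DescribePixelFormat() fill in
 * the matching descriptor so SetPixelFormat() gets real values instead
 * of NULL.  Error handling is minimal. */
static BOOL ApplyPixelFormat(HDC hdc, const PIXELFORMATDESCRIPTOR *wanted)
{
    int format = ChoosePixelFormat(hdc, wanted);
    if (format == 0)
        return FALSE;

    PIXELFORMATDESCRIPTOR actual;
    ZeroMemory(&actual, sizeof(actual));
    if (DescribePixelFormat(hdc, format, sizeof(actual), &actual) == 0)
        return FALSE;

    /* Per the documentation quoted above, "actual" only ends up in any
     * metafile record; rendering is governed by the format index. */
    return SetPixelFormat(hdc, format, &actual);
}
[/code]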

[QUOTE=larryl;1286906]Tonight I had occasion to look at the OpenGL setup code and noticed that I had initialized the PIXELFORMATDESCRIPTOR value for the pre-OpenGL 3 path, but not for the path used for newer OpenGL.

The code ran fine that way, and it runs fine now that I’m passing NULL for the argument.

SetPixelFormat()'s documentation here says this about the PIXELFORMATDESCRIPTOR argument … Does anybody do anything with this SetPixelFormat() argument for current OpenGL?[/QUOTE]

Yes. Here on Win7, I’m passing a non-NULL argument for this parameter: a pointer to an initialized structure, the same pointer I just passed to ChoosePixelFormat() / DescribePixelFormat(), and it’s working fine.

That said, the pixel format that’s actually used for app rendering is one queried via wglChoosePixelFormatARB(), not ChoosePixelFormat()/DescribePixelFormat().

More detail (in case that made no sense): In OpenGL code written in the last 10-15 years targeting Windows (which uses WGL as the window system interface), what you’re likely to see is WGL_ARB_pixel_format used as an override for the under-capable, built-in Windows ChoosePixelFormat() / DescribePixelFormat() calls. That is, something like:

  1. Determine a pixel format with ChoosePixelFormat() / DescribePixelFormat().
  2. If WGL_ARB_pixel_format is supported…

[INDENT=2]a) Create a dummy GL context and bind it.
b) Query function pointers for wglGetPixelFormatAttribivARB() / wglChoosePixelFormatARB().
c) Determine the pixel format we really want with wglGetPixelFormatAttribivARB() / wglChoosePixelFormatARB().
d) Delete the dummy context.
[/INDENT]
  3. Now create the GL context our app will use, using the selected pixel format.

In other words, #1 is just dummy work needed to create the dummy GL context, which lets you query the function pointers you’ll use to choose/set the pixel format your app “really” wants to render with.
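To make the sequence concrete, here’s a condensed sketch of steps 1–3 (my own illustration, not code from the tutorial Larry used). It assumes you’ve already created a throwaway dummy window and your real window, the attribute list is just an example, and error handling is omitted:

[code]
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_ARB_pixel_format tokens and typedefs */

/* "dummyHdc" belongs to a throwaway window, "realHdc" to the window the
 * app actually renders into. */
static HGLRC CreateRealContext(HDC dummyHdc, HDC realHdc)
{
    /* 1. Basic format via the GDI path, just to get a dummy context up. */
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    SetPixelFormat(dummyHdc, ChoosePixelFormat(dummyHdc, &pfd), &pfd);

    /* 2a. Dummy context, bound so wglGetProcAddress() returns useful pointers. */
    HGLRC dummyCtx = wglCreateContext(dummyHdc);
    wglMakeCurrent(dummyHdc, dummyCtx);

    /* 2b. Function pointer from WGL_ARB_pixel_format. */
    PFNWGLCHOOSEPIXELFORMATARBPROC pwglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

    /* 2c. The format we really want, described with WGL attributes. */
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     32,
        WGL_DEPTH_BITS_ARB,     24,
        0
    };
    int  realFormat = 0;
    UINT numFormats = 0;
    pwglChoosePixelFormatARB(realHdc, attribs, NULL, 1, &realFormat, &numFormats);

    /* 2d. Done with the dummy context. */
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(dummyCtx);

    /* 3. Apply the ARB-chosen format to the real window and create the real
     *    context.  DescribePixelFormat() gives us a descriptor to pass to
     *    SetPixelFormat() so we aren't handing it NULL. */
    PIXELFORMATDESCRIPTOR realPfd = { sizeof(realPfd) };
    DescribePixelFormat(realHdc, realFormat, sizeof(realPfd), &realPfd);
    SetPixelFormat(realHdc, realFormat, &realPfd);

    HGLRC realCtx = wglCreateContext(realHdc); /* or wglCreateContextAttribsARB() */
    wglMakeCurrent(realHdc, realCtx);
    return realCtx;
}
[/code]

The separate dummy window is there because Windows only lets you set a pixel format once per window, so you can’t reuse the first window for the ARB-chosen format.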

For more on this, see these pages in the OpenGL Wiki:

Thanks Guys:

I’ll go ahead and put valid stuff in the record.

  Larry

[QUOTE=larryl;1286929]Thanks Guys:

I’ll go ahead and put valid stuff in the record. [/QUOTE]

I wouldn’t sweat the contents much. In the case where the pixel format index comes from wglGetPixelFormatAttrib*vARB() / wglChoosePixelFormatARB(), there really is no appropriate PIXELFORMATDESCRIPTOR. So it makes sense that it’d be ignored. This section in the OpenGL Wiki supports that: