Loading WGL_ARB_pixel_format without creating a dummy render context?

Is there any way to get the procs for WGL_ARB_pixel_format without going through the whole dance first? That is: create an OpenGL render context using ChoosePixelFormat to get a pixel format, call wglGetExtensionsString and then wglGetProcAddress to load the WGL procedures, destroy that render context (which had to be created to use wglGetProcAddress in the first place, but now has to be deleted so we can use a different pixel format), set the pixel format we actually want using the WGL_ARB_pixel_format technique, and finally create a new render context (because we had to select a new pixel format).

I need to be able to choose a pixel format with multisampling, but having to create and destroy a temporary render context just to get the stupid extension to pick the REAL pixel format is EXTREMELY annoying.

  • If you call SetPixelFormat more than once for a window, Win32 gets VERY angry at you :slight_smile:

I guess I’m really just looking for the simplest and most “proper” way to select a multisample pixel format in Windows? The way described above feels intrinsically ad hoc every time I think about it.

I’m afraid you have no real choice.

You have no choice. But wait! It gets worse! Because you can’t SetPixelFormat() more than once on a single window, you actually need to create a (probably hidden) window for the first context, which you then kill and throw away once you have your extension pointers.

Ugly, but it works.

Is there sample code around that does the initialization described above in a proper way?

I’ve tailored my init code for NVidia cards, but from time to time I have problems on ATI cards; without an ATI card for debugging, it is hard to fix that code! :frowning:

I gave the situation a little bit more thought, and here’s what I arrived at… You’ll notice there’s not a lot of error handling in the dummy window part. I don’t think it’s too crucial; if it fails, just assume the driver’s too stupid to support WGL_ARB_pixel_format :stuck_out_tongue:

#include <windows.h>

int
L3D_Sys_GetFirstICD (const HDC hdc)
{
  int pix_fmt,
      max_pix_fmt;

  static PIXELFORMATDESCRIPTOR pfd;

  // DescribePixelFormat conveniently returns the number of pixel formats
  // the DC supports
  ZeroMemory                        (&pfd,   sizeof (PIXELFORMATDESCRIPTOR));
  max_pix_fmt = DescribePixelFormat (hdc, 1, sizeof (PIXELFORMATDESCRIPTOR), &pfd);

  // We need to find an ICD OpenGL pixel format; we don't care about any of the
  // other specifics... (note that pixel format indexes are one-based)
  for (pix_fmt = 1; pix_fmt <= max_pix_fmt; pix_fmt++) {
    ZeroMemory          (&pfd,         sizeof (PIXELFORMATDESCRIPTOR));
    DescribePixelFormat (hdc, pix_fmt, sizeof (PIXELFORMATDESCRIPTOR), &pfd);

    // Obviously we need to support OpenGL :)
    if (! (pfd.dwFlags & PFD_SUPPORT_OPENGL))
      continue;

    // ICD pixel formats aren't generic (GDI software), nor are they generic
    // accelerated (MCD), so both flags must be clear
    if (! ((pfd.dwFlags & PFD_GENERIC_FORMAT) ||
           (pfd.dwFlags & PFD_GENERIC_ACCELERATED)) )
      return pix_fmt;
  }

  // No ICD Pixel Format Found?!
  return 0;
}

Basically this finds the first ICD pixel format offered (as opposed to asking ChoosePixelFormat for a format with certain credentials) and uses that for SetPixelFormat. We don’t care about anything except that the pixel format supports OpenGL and comes from the ICD.

void
L3D_Init_PixelFormatWGL (void)
{
  HDC    hdc     = NULL;
  HGLRC  hrc     = NULL;
  HWND   hWnd    = NULL;
  

  PIXELFORMATDESCRIPTOR pfd;

  // Zero out the descriptor so no field is left holding garbage
  ZeroMemory (&pfd, sizeof (PIXELFORMATDESCRIPTOR));

  pfd.nSize    = sizeof (PIXELFORMATDESCRIPTOR);
  pfd.nVersion = 1;


  hWnd = CreateWindow ("STATIC", "Lyra 3D - Dummy Window",
                       WS_DISABLED, 0, 0, 0, 0, NULL, NULL,
                       NULL, NULL);
  hdc  = GetDC (hWnd);


  SetPixelFormat (hdc, L3D_Sys_GetFirstICD (hdc), &pfd);


  // If we fail, don't bother trying to initialize any
  // extensions...
  if (! ((hrc = wglCreateContext (hdc)) &&
                wglMakeCurrent   (hdc, hrc)) ) {
    if (hrc)
      wglDeleteContext (hrc);

    ReleaseDC     (hWnd, hdc);
    DestroyWindow (hWnd);
    return;
  }


  // Extension Loading Goes Here
  l3dInitWGLExtensionsString ();
  l3dInitPixelFormat         ();


  wglMakeCurrent   (NULL, NULL);
  wglDeleteContext (hrc);

  ReleaseDC        (hWnd, hdc);
  DestroyWindow    (hWnd);
}

There’s not a lot of error handling going on here, but it seems sufficient to me? Replace l3dInit… with your own extension loading code. I have to initialize the extensions string extension first because my extension loading backend uses it when initializing other WGL extensions. I imagine that’s a common scenario for most engines.
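
For reference, here is roughly what those two helpers might boil down to. This is only a sketch, not the actual l3dInit… code: it assumes the wglext.h header from the OpenGL registry for the entry-point typedefs, uses a deliberately simplistic strstr check, and requires that the dummy context created above is still current.

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   // entry-point typedefs from the OpenGL registry
#include <string.h>

PFNWGLGETEXTENSIONSSTRINGARBPROC wglGetExtensionsStringARB = NULL;
PFNWGLCHOOSEPIXELFORMATARBPROC   wglChoosePixelFormatARB   = NULL;

void
l3dInitWGLExtensionsString (void)
{
  // Grab wglGetExtensionsStringARB through the currently bound context
  wglGetExtensionsStringARB = (PFNWGLGETEXTENSIONSSTRINGARBPROC)
    wglGetProcAddress ("wglGetExtensionsStringARB");
}

void
l3dInitPixelFormat (void)
{
  const char *ext;

  if (! wglGetExtensionsStringARB)
    return;

  // Only grab the entry point if the extension is actually advertised.
  // (A real check should match whole tokens rather than substrings.)
  ext = wglGetExtensionsStringARB (wglGetCurrentDC ());
  if (ext && strstr (ext, "WGL_ARB_pixel_format"))
    wglChoosePixelFormatARB = (PFNWGLCHOOSEPIXELFORMATARBPROC)
      wglGetProcAddress ("wglChoosePixelFormatARB");
}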

You could in theory also initialize ALL of your extensions in this step, but I’m not too fond of that :slight_smile:
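
And for completeness, a sketch of the step the original question was actually about: once wglChoosePixelFormatARB has been loaded, use it to pick a multisampled format for the real window’s DC. The L3D_ChooseMultisampleFormat name, the attribute list, and the 32/24/8-bit and sample-count values are assumptions rather than code from this thread, and the sample attributes require WGL_ARB_multisample to be supported.

#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>

extern PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB;

int
L3D_ChooseMultisampleFormat (HDC hdc, int samples)
{
  int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_ACCELERATION_ARB,   WGL_FULL_ACCELERATION_ARB,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     32,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_STENCIL_BITS_ARB,   8,
    WGL_SAMPLE_BUFFERS_ARB, GL_TRUE,   // requires WGL_ARB_multisample
    WGL_SAMPLES_ARB,        samples,
    0                                  // attribute list terminator
  };
  int  pix_fmt     = 0;
  UINT num_formats = 0;

  if (! wglChoosePixelFormatARB)
    return 0;

  // Ask the driver for the closest matching format
  if (! wglChoosePixelFormatARB (hdc, attribs, NULL, 1, &pix_fmt, &num_formats) ||
      num_formats == 0)
    return 0;

  return pix_fmt;
}

The returned format still goes through SetPixelFormat exactly once, on a window that has never had a pixel format set, followed by the usual wglCreateContext / wglMakeCurrent.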