I gave the situation a little more thought, and here’s what I arrived at… You’ll notice there’s not a lot of error handling in the dummy window part. I don’t think it’s too crucial; if it fails, just assume the driver’s too stupid to support WGL_ARB_pixel_format.
int
L3D_Sys_GetFirstICD (const HDC hdc)
{
    int pix_fmt, max_pix_fmt;
    static PIXELFORMATDESCRIPTOR pfd;

    ZeroMemory (&pfd, sizeof (PIXELFORMATDESCRIPTOR));
    max_pix_fmt = DescribePixelFormat (hdc, 1, sizeof (PIXELFORMATDESCRIPTOR), &pfd);

    // We need to find an ICD OpenGL pixel format; we don't care about any of the
    // other specifics... Note that pixel format indices are one-based.
    for (pix_fmt = 1; pix_fmt <= max_pix_fmt; pix_fmt++) {
        ZeroMemory (&pfd, sizeof (PIXELFORMATDESCRIPTOR));
        DescribePixelFormat (hdc, pix_fmt, sizeof (PIXELFORMATDESCRIPTOR), &pfd);

        // Obviously we need to support OpenGL :)
        if (! (pfd.dwFlags & PFD_SUPPORT_OPENGL))
            continue;

        // ICD pixel formats aren't generic (GDI), nor are they generic accelerated (MCD)
        if (! ((pfd.dwFlags & PFD_GENERIC_FORMAT) ||
               (pfd.dwFlags & PFD_GENERIC_ACCELERATED)))
            return pix_fmt;
    }

    // No ICD pixel format found?! Zero is never a valid format index,
    // so a subsequent SetPixelFormat will fail cleanly.
    return 0;
}
Basically this finds the first ICD pixel format offered (as opposed to asking ChoosePixelFormat for a format with certain credentials) and uses that for SetPixelFormat. We don’t care about anything except that the pixel format supports OpenGL and is ICD.
void
L3D_Init_PixelFormatWGL (void)
{
    HDC hdc = NULL;
    HGLRC hrc = NULL;
    HWND hWnd = NULL;
    PIXELFORMATDESCRIPTOR pfd;

    ZeroMemory (&pfd, sizeof (PIXELFORMATDESCRIPTOR));
    pfd.nSize = sizeof (PIXELFORMATDESCRIPTOR);
    pfd.nVersion = 1;

    hWnd = CreateWindow ("STATIC", "Lyra 3D - Dummy Window",
                         WS_DISABLED, 0, 0, 0, 0, NULL, NULL,
                         NULL, NULL);
    hdc = GetDC (hWnd);
    SetPixelFormat (hdc, L3D_Sys_GetFirstICD (hdc), &pfd);

    // If we fail, don't bother trying to initialize any
    // extensions...
    if (! ((hrc = wglCreateContext (hdc)) &&
           wglMakeCurrent (hdc, hrc))) {
        ReleaseDC (hWnd, hdc);
        DestroyWindow (hWnd);
        return;
    }

    // Extension Loading Goes Here
    l3dInitWGLExtensionsString ();
    l3dInitPixelFormat ();

    wglMakeCurrent (NULL, NULL);
    wglDeleteContext (hrc);
    ReleaseDC (hWnd, hdc);
    DestroyWindow (hWnd);
}
There’s not a lot of error handling going on here, but it seems sufficient to me. Replace the l3dInit… calls with your own extension loading code. I have to initialize the extensions string extension first because my extension loading backend uses it when initializing the other WGL extensions; I imagine that’s a common scenario for most engines.
You could in theory also initialize ALL of your extensions in this step, but I’m not too fond of that.