I suppose it's possible that there isn't a valid context when the above code is called (I haven't delved fully into what the SDL code does).
Not likely, though, as here's what a Radeon reports (it's going through wglMakeCurrent anyway):
wglChoosePixelFormat(0xaa0117ef,0x187024)=2
wglSetPixelFormat(0xaa0117ef,2,0x187024)=true
wglCreateContext(0xaa0117ef)
----->wglGetPixelFormat(0xaa0117ef)=2
----->wglDescribePixelFormat(0xaa0117ef,2,40,0x12ea14)=60
----->wglGetPixelFormat(0xaa0117ef)=2 =0x10000
wglMakeCurrent(0xaa0117ef,0x10000)
----->wglGetPixelFormat(0xaa0117ef)=2
----->wglGetPixelFormat(0xaa0117ef)=2 =true
wglGetProcAddress("wglGetExtensionsStringARB")=0x100299f0
wglGetExtensionsStringARB(0xaa0117ef)="WGL_ARB_extensions_string..."
wglGetProcAddress("wglChoosePixelFormatARB")=0x10029a50
wglChoosePixelFormatARB(0xaa0117ef,0x12ebc0,0x12ebac,1,0x12eba4,0x12eba8)
----->wglDescribePixelFormat(0xaa0117ef,1,40,0x0000)=60
----->wglDescribePixelFormat(0xaa0117ef,25,40,0x12e9d0)=60
----->wglDescribePixelFormat(0xaa0117ef,26,40,0x12e9d0)=60
----->wglDescribePixelFormat(0xaa0117ef,27,40,0x12e9d0)=60
----->wglDescribePixelFormat(0xaa0117ef,28,40,0x12e9d0)=60
etc.
And an Nvidia:
wglChoosePixelFormat(0xa10114ea,0x3a4354)
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =10
wglSetPixelFormat(0xa10114ea,10,0x3a4354)
----->wglDescribePixelFormat(0xa10114ea,10,40,0x51fff4c)=126 =true
wglCreateContext(0xa10114ea)
----->wglGetPixelFormat(0xa10114ea)=10
----->wglDescribePixelFormat(0xa10114ea,10,40,0x12ea5c)=126
----->wglGetPixelFormat(0xa10114ea)=10
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =0x10000
wglMakeCurrent(0xa10114ea,0x10000)
----->wglGetPixelFormat(0xa10114ea)=10
----->wglGetPixelFormat(0xa10114ea)=10
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =true
wglGetProcAddress("wglGetExtensionsStringARB")=0x100299f0
wglGetExtensionsStringARB(0xa10114ea)="WGL_ARB_buffer_region WGL..."
wglGetProcAddress("wglChoosePixelFormatARB")=0x10029a50
wglChoosePixelFormatARB(0xa10114ea,0x12ec00,0x12ebec,1,0x12ebe4,0x12ebe8)
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
I'll have to dig deeper; having the spec wording on why wglDescribePixelFormat ends up reporting GL_INVALID_OPERATION would be nice.