wglDescribePixelFormat function description

The result from the code

SDL_VideoModeOK( application_window_width, application_window_height, application_window_bpp, sdl_surface_flags );

according to GLIntercept is

wglChoosePixelFormatARB(0x101098b,0x12e548,0x12e534,1,0x12e52c,0x12e530)
----->wglDescribePixelFormat(0x101098b,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0x101098b,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0x101098b,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0x101098b,1,0,0x0000)=126 glGetError() =GL_INVALID_OPERATION
etc

So it looks like there's an error there (I assume the 0x0000), but anyway I can't seem to find any info on what parameters to pass to the wglDescribePixelFormat function.
So I'm wondering, does anyone have a link to where this is documented?

ta zed

wglDescribePixelFormat(hdc, iPixelFormat, sizeof( pfd ), &pfd);

The first parameter is the GDI device context
The second parameter is the integer pixel format
The third parameter is the size of the passed PIXELFORMATDESCRIPTOR
The fourth parameter is a pointer to a PIXELFORMATDESCRIPTOR structure that will be filled with the description of the pixel format passed in iPixelFormat
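Put together, a call to the documented GDI twin, DescribePixelFormat, looks like the sketch below (the DC choice and error handling are illustrative only; wglDescribePixelFormat appears to take the same arguments, but that is an assumption):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc = GetDC(NULL);              /* screen DC, just for illustration */
    PIXELFORMATDESCRIPTOR pfd = {0};

    /* Returns the number of pixel formats the DC supports (or 0 on
       failure) and fills pfd with the description of format 1. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    if (count == 0) {
        fprintf(stderr, "DescribePixelFormat failed: %lu\n", GetLastError());
    } else {
        printf("%d formats; format 1 has %d color bits\n",
               count, pfd.cColorBits);
    }
    ReleaseDC(NULL, hdc);
    return 0;
}
```

Note that the return value is the total number of formats, which matches the `=126` and `=60` values in the logs above.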

Cheers.

So I'm wondering, does anyone have a link to where this is documented?

It isn't. wglDescribePixelFormat is one of five WGL functions internal to opengl32.dll. I'm guessing the parameters mirror those of DescribePixelFormat, but there's no guarantee of that. Passing NULL as the fourth parameter is acceptable.
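Undocumented or not, the function is exported from opengl32.dll, so it can be fetched at run time if you want to call it directly. A sketch, assuming the signature really does mirror DescribePixelFormat (the typedef below is a guess, not documentation):

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical typedef: assumes wglDescribePixelFormat takes the same
   arguments as the documented DescribePixelFormat. */
typedef int (WINAPI *PFN_wglDescribePixelFormat)(HDC, int, UINT,
                                                 PIXELFORMATDESCRIPTOR *);

int main(void)
{
    HMODULE gl = GetModuleHandleA("opengl32.dll");
    if (!gl)
        gl = LoadLibraryA("opengl32.dll");

    /* GetProcAddress finds it because it is in the DLL's export table,
       even though it has no public documentation. */
    PFN_wglDescribePixelFormat pfn = (PFN_wglDescribePixelFormat)
        GetProcAddress(gl, "wglDescribePixelFormat");
    printf("wglDescribePixelFormat %s\n", pfn ? "found" : "not found");
    return 0;
}
```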

It should also be noted that, if you haven't created a valid rendering context yet, glGetError doesn't mean anything.

thanks guys, I suppose it's possible that there isn't a valid context when the above code is called (I haven't delved fully into what the SDL code does)

I'm also getting these messages:

ExtensionFunction::AddFunction - Function glBindTexture does not match previous lookup (multiple OpenGL devices?)
ExtensionFunction::AddFunction - Function glGenTextures does not match previous lookup (multiple OpenGL devices?)
ExtensionFunction::AddFunction - Function glPopClientAttrib does not match previous lookup (multiple OpenGL devices?)
ExtensionFunction::AddFunction - Function glPushClientAttrib does not match previous lookup (multiple OpenGL devices?)
ExtensionFunction::AddFunction - Function glTexSubImage2D does not match previous lookup (multiple OpenGL devices?)

similar to this thread
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=263276#Post263276

the thing is, I don't have a single glPopClientAttrib or glPushClientAttrib call in my code, so I've no idea what's happening there (I assume SDL is calling them somewhere)

I suppose it's possible that there isn't a valid context when the above code is called (I haven't delved fully into what the SDL code does)

not likely, as here's what a Radeon reports (it's getting to wglMakeCurrent anyway)


wglChoosePixelFormat(0xaa0117ef,0x187024)=2 
wglSetPixelFormat(0xaa0117ef,2,0x187024)=true 
wglCreateContext(0xaa0117ef)
----->wglGetPixelFormat(0xaa0117ef)=2 
----->wglDescribePixelFormat(0xaa0117ef,2,40,0x12ea14)=60 
----->wglGetPixelFormat(0xaa0117ef)=2 =0x10000 
wglMakeCurrent(0xaa0117ef,0x10000)
----->wglGetPixelFormat(0xaa0117ef)=2 
----->wglGetPixelFormat(0xaa0117ef)=2 =true 
wglGetProcAddress("wglGetExtensionsStringARB")=0x100299f0 
wglGetExtensionsStringARB(0xaa0117ef)="WGL_ARB_extensions_string..." 
wglGetProcAddress("wglChoosePixelFormatARB")=0x10029a50 
wglChoosePixelFormatARB(0xaa0117ef,0x12ebc0,0x12ebac,1,0x12eba4,0x12eba8)
----->wglDescribePixelFormat(0xaa0117ef,1,40,0x0000)=60 
----->wglDescribePixelFormat(0xaa0117ef,25,40,0x12e9d0)=60 
----->wglDescribePixelFormat(0xaa0117ef,26,40,0x12e9d0)=60 
----->wglDescribePixelFormat(0xaa0117ef,27,40,0x12e9d0)=60 
----->wglDescribePixelFormat(0xaa0117ef,28,40,0x12e9d0)=60
etc

and an NVIDIA:


wglChoosePixelFormat(0xa10114ea,0x3a4354)
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =10 
wglSetPixelFormat(0xa10114ea,10,0x3a4354)
----->wglDescribePixelFormat(0xa10114ea,10,40,0x51fff4c)=126 =true 
wglCreateContext(0xa10114ea)
----->wglGetPixelFormat(0xa10114ea)=10 
----->wglDescribePixelFormat(0xa10114ea,10,40,0x12ea5c)=126 
----->wglGetPixelFormat(0xa10114ea)=10 
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =0x10000 
wglMakeCurrent(0xa10114ea,0x10000)
----->wglGetPixelFormat(0xa10114ea)=10 
----->wglGetPixelFormat(0xa10114ea)=10 
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126 =true 
wglGetProcAddress("wglGetExtensionsStringARB")=0x100299f0 
wglGetExtensionsStringARB(0xa10114ea)="WGL_ARB_buffer_region WGL..." 
wglGetProcAddress("wglChoosePixelFormatARB")=0x10029a50 
wglChoosePixelFormatARB(0xa10114ea,0x12ec00,0x12ebec,1,0x12ebe4,0x12ebe8)
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126  glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126  glGetError() =GL_INVALID_OPERATION
----->wglDescribePixelFormat(0xa10114ea,1,0,0x0000)=126  glGetError() =GL_INVALID_OPERATION

I'll have to dig deeper; having a spec on why wglDescribePixelFormat throws GL_INVALID_OPERATION would be nice

As I pointed out before, WGL functions can run when no GL context exists, which means they cannot rely on GL state, including the error state. So if a WGL function appears to be generating a GL error, feel free to ignore it; the function should not be producing GL errors at all.
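One way to keep the two error channels apart is to consult glGetError only when a context is actually current, and use GetLastError for WGL/Win32 failures. A minimal sketch; `report_errors` is a hypothetical helper, not part of any API:

```c
#include <windows.h>
#include <GL/gl.h>
#include <stdio.h>

/* Hypothetical helper: report from the right error channel. */
void report_errors(void)
{
    if (wglGetCurrentContext() != NULL) {
        /* A context is current, so the GL error state is meaningful. */
        GLenum err = glGetError();
        if (err != GL_NO_ERROR)
            printf("GL error: 0x%04X\n", err);
    } else {
        /* No current context: glGetError results are meaningless here,
           so fall back to the Win32 error state for WGL calls. */
        printf("last Win32 error: %lu\n", GetLastError());
    }
}
```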

yes, thanks, but as you can see from the lines

wglCreateContext(0xa10114ea)
wglMakeCurrent(0xa10114ea,0x10000)

the GL context is created, so I don't know where the NVIDIA errors are coming from

oh well

the GL context is created

That doesn't change the fact that none of these functions has any right to be touching the glError state, or any GL state for that matter. GL functions affect GL state; WGL and GLX functions do not (outside of the initial setup).