Some strangeness in 3.3 context creation

Hello, everyone.

Recently I tried to understand the process of creating an OpenGL 3.3 context. If I understand it right, I need to create a dummy window, set its pixel format, create a dummy context, fetch the WGL function addresses, create a new window and context with them, destroy the dummy context, and destroy the old window.
Are these the correct steps?
If so, then there must be something I'm missing. I've attached a small sample (no dependencies, just the VS2010 project).
What I don't understand is that this sample works correctly as it is (it just outputs the glGetString() info to the console). But if I check it under gDEBugger or GLIntercept, they both show an error right after the first glGetString(), which is not caught by the app itself (there is a check in it). More than that, gDEBugger shows an access violation. I've tested this on an ATI/AMD 5400 (Mobility) and an ATI/AMD 6870, and I've already tried three driver versions, starting from the newest one.

If someone could please take a look at this, it would be greatly appreciated.

Thank you.

Just in case, this is what GLIntercept 1.0.2 shows:

GL Intercept Log. Version : 1.02 Compile Date: Nov 5 2011 Run on: Tue Nov 08 11:46:38 2011

===================================================
GL ERROR - Function glGetString(GL_VENDOR) generated error GL_INVALID_ENUM

Log End.

And the full log:

===============================================================================
GLIntercept version 1.02 Log generated on: Tue Nov 08 11:46:38 2011

===============================================================================

wglChoosePixelFormat(58012B14,0019FC08)=2
wglSetPixelFormat(58012B14,2,0019FC08)=true
wglCreateContext(58012B14)=00010000
wglMakeCurrent(58012B14,00010000)=true
wglGetProcAddress(“wglChoosePixelFormatARB”)=10025340
wglGetProcAddress(“wglCreateContextAttribsAR…”)=10025370
wglChoosePixelFormatARB(1C012D15,0019FABC,00000000,1,0019FAB0,0019FAA4)=true
wglSetPixelFormat(1C012D15,2,0019FA74)=true
wglCreateContextAttribsARB(1C012D15,00000000,0019FA48)
----->wglCreateLayerContext(1C012D15,0)=00010001 =00010001
wglMakeCurrent(1C012D15,00010001)=true
wglDeleteContext(00010000)=true
glGetString(GL_VENDOR)=“ATI Technologies Inc.” glGetError() = GL_INVALID_ENUM
glGetError()=GL_INVALID_ENUM
glGetString(GL_RENDERER)=“AMD Radeon HD 6800 Series”
glGetError()=GL_NO_ERROR
glGetString(GL_VERSION)=“3.3.11079 Core Profile Fo…”
glGetError()=GL_NO_ERROR
glGetString(GL_SHADING_LANGUAGE_VERSION)=“4.10”
glGetError()=GL_NO_ERROR
wglMakeCurrent(1C012D15,00000000)=true
wglDeleteContext(00010001)=true

What is really strange is that the app itself doesn't receive that error when launched on its own, only when running under GLIntercept.

No, you don’t need to create a dummy window to create a GL 3.3 context.
http://www.opengl.org/wiki/Tutorials

and more specifically
http://www.opengl.org/wiki/Tutorial:_OpenGL_3.1_The_First_Triangle_(C%2B%2B/Win)

and
http://www.opengl.org/wiki/Tutorial1:_Rendering_shapes_with_glDrawRangeElements,_VAO,_VBO,_shaders_(C%2B%2B_/_freeGLUT)

No, you don’t need to create a dummy window to create a GL 3.3 context.

You do if you want it to be a core context. Or to have an sRGB framebuffer. Or to have multisampling.

Basically, you should always use the dummy-window method, even if you don't currently need what it gives you.

Sure, you need it for multisampling: you need to choose a pixel format that supports multisampling, and since Microsoft doesn't let us set the pixel format on the same window twice, you have to go with the "dummy window" hack.

I don’t know about sRGB framebuffer. Does that require choosing the right pixelformat?

If you don't need multisampling, then all you need to do is create an old GL context, get all the function pointers, destroy the context, and use wglCreateContextAttribsARB to create your GL 3.3 context.

If you don't need multisampling, then all you need to do is create an old GL context, get all the function pointers, destroy the context, and use wglCreateContextAttribsARB to create your GL 3.3 context.

But you need to set a pixel format on the window to create a context.

Indeed! And just once for each window.

To be more precise (a code sketch follows this list):

  • Choose pixel format
  • Set pixel format
  • Create an old GL context
  • Make it current
  • Set attributes for the new context and make it with wglCreateContextAttribsARB()
  • Deactivate and destroy the old context
  • Make new context current
  • Grab the other function pointers. (If you do this while the old GL context is current, you'll get errors in some debuggers (gDEBugger, for example), since the functions will be used in another context.)
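
For illustration only, here is a rough sketch of those steps in WGL/C++, without multisampling. This is not the attached sample's code; the function name CreateCoreContext and the variable hdc are made up, the constants come from WGL_ARB_create_context (normally taken from wglext.h), and error handling is omitted.

// Rough sketch only: create a 3.3 core context on an existing window DC "hdc".
#include <windows.h>

#define WGL_CONTEXT_MAJOR_VERSION_ARB    0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB    0x2092
#define WGL_CONTEXT_PROFILE_MASK_ARB     0x9126
#define WGL_CONTEXT_CORE_PROFILE_BIT_ARB 0x0001

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int*);

HGLRC CreateCoreContext(HDC hdc)
{
    // Choose and set a pixel format -- only once per window.
    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1 };
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

    // Create an old-style context and make it current so wglGetProcAddress() works.
    HGLRC oldCtx = wglCreateContext(hdc);
    wglMakeCurrent(hdc, oldCtx);

    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

    // Set attributes for the new context and create it.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0
    };
    HGLRC newCtx = wglCreateContextAttribsARB(hdc, 0, attribs);

    // Deactivate and destroy the old context, then make the new one current.
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(oldCtx);
    wglMakeCurrent(hdc, newCtx);

    // Grab the remaining GL function pointers here, with the new context current.
    return newCtx;
}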

We have to make a dummy window only when we need to set a pixel format that is not "the final" one, since a window's pixel format cannot be changed once set. For multisampling we have to use wglChoosePixelFormatARB() to retrieve the desired pixel format; wglChoosePixelFormatARB() requires an active GL context, and a GL context requires a pixel format to be set on a window. The reason for the dummy window is obvious, isn't it?
For multisampling the procedure is the following (a sketch follows the list):

  • Create a dummy window (or use some other window for the purpose)
  • Choose pixel format
  • Set pixel format (for the dummy/other window)
  • Create an old GL context
  • Make it current
  • Choose a (multisampled) pixel format
  • Set pixel format (for the destination window)
  • Set attributes for the new context and make it with wglCreateContextAttribsARB()
  • Deactivate and destroy the old context
  • Make new context current
  • Grab the other pointers
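
Again only as a hedged sketch, not taken from anyone's actual code in this thread: the middle of that procedure (using the dummy context to fetch wglChoosePixelFormatARB() and then setting a multisampled format on the destination window) could look roughly like this. The function name SetMultisampledPixelFormat and the variable realDC are illustrative; the constants come from WGL_ARB_pixel_format and WGL_ARB_multisample.

// Sketch only: assumes a plain GL context created on the dummy window is current,
// and that "realDC" belongs to the destination window, which has no pixel format yet.
#include <windows.h>

#define WGL_DRAW_TO_WINDOW_ARB 0x2001
#define WGL_SUPPORT_OPENGL_ARB 0x2010
#define WGL_DOUBLE_BUFFER_ARB  0x2011
#define WGL_PIXEL_TYPE_ARB     0x2013
#define WGL_COLOR_BITS_ARB     0x2014
#define WGL_DEPTH_BITS_ARB     0x2022
#define WGL_TYPE_RGBA_ARB      0x202B
#define WGL_SAMPLE_BUFFERS_ARB 0x2041
#define WGL_SAMPLES_ARB        0x2042

typedef BOOL (WINAPI *PFNWGLCHOOSEPIXELFORMATARBPROC)
    (HDC, const int*, const FLOAT*, UINT, int*, UINT*);

int SetMultisampledPixelFormat(HDC realDC, int samples)
{
    PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB =
        (PFNWGLCHOOSEPIXELFORMATARBPROC)wglGetProcAddress("wglChoosePixelFormatARB");

    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, TRUE,
        WGL_SUPPORT_OPENGL_ARB, TRUE,
        WGL_DOUBLE_BUFFER_ARB,  TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_COLOR_BITS_ARB,     32,
        WGL_DEPTH_BITS_ARB,     24,
        WGL_SAMPLE_BUFFERS_ARB, 1,
        WGL_SAMPLES_ARB,        samples,
        0
    };

    int format = 0;
    UINT count = 0;
    wglChoosePixelFormatARB(realDC, attribs, 0, 1, &format, &count);

    // Set the multisampled format on the destination window (once and only once),
    // then continue with wglCreateContextAttribsARB() as in the earlier sketch.
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(realDC, format, sizeof(pfd), &pfd);
    SetPixelFormat(realDC, format, &pfd);
    return format;
}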

Thank you all for your feedback.
And yes, I do need multisampling (at least) and maybe sRGB in the future, so I wanted to get this initialization stuff cleared up.
The procedure is exactly the same as described by Aleksandar, with one exception: it doesn't completely work as intended.
I have experimented with many different combinations of creating the dummy window, destroying the dummy window, and context creation/activation/deletion. After all of that I came to a few conclusions:

  1. A 2.x context, and glGetString() after it, works normally and without any errors from gDEBugger or GLIntercept, but a 3.x context does not (I have tried minor versions 1/2/3).
  2. A 3.x context works without errors ONLY if I set it up as DEBUG/COMPATIBILITY, and exactly both must be set; no other combination works (see the snippet below).
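
For reference, and only as a guess at what the equivalent attribute list looks like rather than the actual sample code, the DEBUG + COMPATIBILITY combination mentioned in point 2 corresponds to something like this (constants from WGL_ARB_create_context):

#define WGL_CONTEXT_MAJOR_VERSION_ARB             0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB             0x2092
#define WGL_CONTEXT_FLAGS_ARB                     0x2094
#define WGL_CONTEXT_PROFILE_MASK_ARB              0x9126
#define WGL_CONTEXT_DEBUG_BIT_ARB                 0x0001
#define WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB 0x0002

// The only combination reported to run without errors on the AMD driver here:
// a 3.x debug context with the compatibility profile.
const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_FLAGS_ARB,         WGL_CONTEXT_DEBUG_BIT_ARB,
    WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
    0
};
// hRC = wglCreateContextAttribsARB(hdc, 0, attribs);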

What I really don't understand is why this GL_INVALID_ENUM error shows up ONLY under GLIntercept or gDEBugger. Why isn't it showing up without them?

You would have to look into the code of GLIntercept to figure that out, and the code itself looks complex. Good luck.
Of course, gDEBugger is not open source, so you would have to ask them.

It is up to them to document their products' behavior (GLIntercept and gDEBugger).

So if I get you right, you mean that I shouldn't bother about it at all and everything I did is correct?

I just tried your sample app and had no issues on Nvidia.

One thing you can do is use GLIntercept against itself: create a new directory somewhere and put the opengl32.dll and gliConfig.ini file (full debug version) in that directory.

Then in the original gliConfig file, point the system lib at the new one:

GLSystemLib = "<newdir>\opengl32.dll"

Run the app, then compare the two GLIntercept logs afterwards; the differences will show which internal OpenGL calls GLIntercept made (and which one caused the GL error).

I just tried your sample app and had no issues on Nvidia.

Yep. It works perfectly on NV.

One thing you can do is use GLIntercept against itself: create a new directory somewhere and put the opengl32.dll and gliConfig.ini file (full debug version) in that directory.

Then in the original gliConfig file, point the system lib at the new one:

GLSystemLib = "<newdir>\opengl32.dll"

Run the app, then compare the two GLIntercept logs afterwards; the differences will show which internal OpenGL calls GLIntercept made (and which one caused the GL error).

Did that. The new log file (the one in the new folder) shows this:

GL Intercept Log. Version : 1.02 Compile Date: Nov 5 2011 Run on: Thu Nov 10 10:07:47 2011

===================================================
GL ERROR - Function glGetString(GL_VERSION) generated error GL_INVALID_ENUM
GL ERROR - Function glGetIntegerv(GL_MAX_TEXTURE_UNITS,01274B98) generated error GL_INVALID_ENUM

Log End.

The other log file is pretty big and also contains some errors reported by glGetError().

Where in relation to the original log file are the calls to glGetString(GL_VERSION) and glGetIntegerv(GL_MAX_TEXTURE_UNITS, ) that are generating the error?

Straight after the first wglMakeCurrent? Anyway, find the place, then manually call these functions and see if it also results in a GL error (with no debugger attached).

Can you zip up the second log and post it here? If it is too big, PM me and I'll give you my email.

In the original file (the one generated from my app, if that's what you are talking about) there are no glGetIntegerv() calls at all. In fact, there is just one glGetString() declared and one glGetError(). No other GL functions are used or even declared (I use my own declarations, no gl.h and such). These errors come from GLIntercept itself (they are in the new log file); the old log file doesn't have them.

No functions in my app generate the errors. This is the strangest thing. I have another project where many GL functions are called and there are no errors at all, but they sometimes appear under GLIntercept or gDEBugger.

Sure. It's not that big; it's small enough to post directly.

No, what I meant was: can you change your app to include these OpenGL calls, to see if it is some bug in the AMD driver?

Looking closer at that log file, it seems that it is the GL_MAX_TEXTURE_UNITS call. To verify, can you disable the image logger (in both config files if running with both) and see if you still get the error:

ImageLog
{
LogEnabled = False;
}

I have had trouble with Nvidia allowing old texture enums in core contexts (and as I develop on Nvidia, I never see them).

Hmm… Looks like that's it!
I've disabled the image logger in GLIntercept and it doesn't report errors anymore. I've checked both log files.
Then I added glGetIntegerv(GL_MAX_TEXTURE_UNITS) to my app, and after calling it my glGetError() returns GL_INVALID_ENUM even without GLIntercept attached. When GLIntercept is attached, it also catches this error.
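
As a side note (not part of the original sample): GL_MAX_TEXTURE_UNITS belongs to the fixed-function multitexture state that core profiles removed, which matches the GL_INVALID_ENUM seen here; the query that still exists in a core profile is GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS. Roughly, assuming a current 3.3 core context and the usual GL declarations:

#define GL_MAX_TEXTURE_UNITS                0x84E2  // legacy enum, removed in core
#define GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS 0x8B4D  // valid in core profiles

GLint units = 0;
glGetIntegerv(GL_MAX_TEXTURE_UNITS, &units);
// glGetError() == GL_INVALID_ENUM in a 3.3 core context on this AMD driver
// (NVIDIA apparently still accepts it, as noted above)

glGetIntegerv(GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, &units);
// glGetError() == GL_NO_ERROR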

OK, so it seems it was just a simple bug in my code to do with core contexts (or perhaps also a bug in Nvidia for still allowing that enum).

I'll do another update when I have some time (although technically I do have a disclaimer on the website about stuff not working in core contexts :) ).

Thanks for your help!
I guess now I can freely continue conquering the world…
