How to compile opengl 1.2 with Visual Studio 6

hi,
My problem is that I cannot find any way to compile OpenGL 1.2 GL functions (and GL enum types) with MS Visual Studio 6.0. My OS is Windows XP SP1 and I know Microsoft no longer supports OpenGL. Adding the new enums to gl.h and using wglGetProcAddress to find the entry points does not work, since Microsoft's opengl32.dll in windows\system32 still only implements OpenGL 1.1 and lacks the newer functions. I tried the Mesa library, but it is much slower (software implementation) and behaves differently from Microsoft's opengl32.dll, so it is not an option.
I also wonder: if the standard opengl32.dll is only 1.1 compliant, then how do 1.2-1.5 demos and games run on Windows?
Does anybody know how to compile and use GL 1.2 functions with Visual Studio?

Apparently you missed something, because everybody uses wglGetProcAddress on Windows.

Just read the faq :
http://opengl.org/resources/faq/getting_started.html

I used Visual Studio's Depends tool to check whether the new 1.2 functions really have entry points in opengl32.dll, but only the 1.1 functions were inside. Also, when I change gl.h to a 1.2 gl.h and try to use a 1.2-specific GL enum with a glGet function, I get a GL_INVALID_ENUM error. This means that even if the GL enums are defined in gl.h, they are not recognized by opengl32.dll. Did I miss anything? Or do I need another opengl32.dll?

You don't need a new opengl32.dll, but you do need an installed OpenGL driver that supports OpenGL 1.2 (or whatever version you want) in order to use such functions or enums. If you have such a driver, you can use the new enums directly (even with the 1.1 GL functions) and they will work as expected. For the new functions you get their pointers with wglGetProcAddress and then call them through those pointers. You can't call them directly like the 1.1 ones, since opengl32.dll doesn't provide entry points for them. But that's really not a problem, because this can easily be made transparent to your source code. For example, instead of GL/gl.h include an auto-generated header of your own that declares all GL functions as pointers (e.g. "void (APIENTRY * glBegin) (GLenum mode);", where APIENTRY is defined as __stdcall on win32 and as an empty macro on linux) and have these pointers defined and initialized somewhere. For the initialization you can write a function to get the addresses like this:

#include <windows.h>
#include <GL/gl.h>

static HMODULE opengl32_handle; /* e.g. LoadLibrary("opengl32.dll") at startup */

void *getfnaddr(const char *name)
{
    /* GetProcAddress finds the 1.1 entry points exported by opengl32.dll;
       wglGetProcAddress finds the newer ones (needs a current GL context) */
    void *p = (void *) GetProcAddress(opengl32_handle, name);
    if (!p) p = (void *) wglGetProcAddress(name);
    return p;
}

and then one long function with the initialization code:

void init_gl_pointers()
{
    *(void **)&glBegin = getfnaddr("glBegin");
    *(void **)&glEnable = getfnaddr("glEnable");
    // ... and so on
}

You should try both GetProcAddress and wglGetProcAddress, because the first one works only for the 1.1 functions and the second one works only for the others.

Then you use all the GL functions in all other sources just as you would use the 1.1 functions when including GL/gl.h.

As an addendum to the advice already given, the OP may want to consider one of the GL extension loading libraries floating around. I’m using GLee and loving it. The way it exposes functionality to the programmer is intuitive and very easy to use. I highly recommend it.

GLee Download Page

Thanks all you guys.
It seems the graphics card at my office doesn't have a recent OpenGL driver, because even though wglGetProcAddress returns a non-null pointer, the enum parameters are not recognized. I used glHistogram for testing: when I pass the GL_HISTOGRAM parameter I get a GL_INVALID_ENUM error. However, the same code works at home.
By the way, GLee looks pretty good and makes extensions easy to call.

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.