My goal in the following code is to initialize GLUT without an error, using the simplest, most minimal code I know of:
#include <GL/glut.h>

int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    return 1;
}
glut32.dll is in the same folder as the .exe.
It compiles fine, but at runtime I get:
Lighthouse3d_initialize.exe stopped working
mhagain · October 21, 2017, 12:58pm
Is your program 32-bit or 64-bit? What character set are you using? Unicode?
I’ve never been asked the Unicode question before; it has never been an issue in any of the many successful GL demos I’ve done. How do I get you that information?

I am pretty sure my program is 32-bit. How do I check this?

The .exe is x86, and glut32.dll is x86 as well.
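(For anyone else wondering how to check: dumpbin /headers yourapp.exe from a Visual Studio Developer Command Prompt prints the machine type. Alternatively, here is a minimal sketch that reads the PE header directly; it assumes a valid PE file and a little-endian host, and the constants 0x014C for x86 and 0x8664 for x64 come from the PE/COFF spec:)

#include <cstdio>
#include <cstdint>

int main(int argc, char *argv[])
{
    if (argc < 2) { std::printf("usage: %s <file>\n", argv[0]); return 1; }

    std::FILE *f = std::fopen(argv[1], "rb");
    if (!f) { std::printf("cannot open %s\n", argv[1]); return 1; }

    // Byte 0x3C of the DOS header holds the offset of the PE signature.
    std::uint32_t peOffset = 0;
    std::fseek(f, 0x3C, SEEK_SET);
    std::fread(&peOffset, sizeof(peOffset), 1, f);

    // The COFF Machine field follows the 4-byte "PE\0\0" signature.
    std::uint16_t machine = 0;
    std::fseek(f, peOffset + 4, SEEK_SET);
    std::fread(&machine, sizeof(machine), 1, f);
    std::fclose(f);

    if (machine == 0x014C)      std::printf("32-bit (x86)\n");
    else if (machine == 0x8664) std::printf("64-bit (x64)\n");
    else                        std::printf("unknown machine 0x%04X\n", machine);
    return 0;
}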
Amazingly, adding the following preprocessor definition completely resolved the problem and resulted in a running program:

#define GLUT_DISABLE_ATEXIT_HACK

The working code:
#include <iostream>

// Must be defined BEFORE #include <GL/glut.h>.
#define GLUT_DISABLE_ATEXIT_HACK
#include <GL/glut.h>

int main(int argc, char *argv[])
{
    std::cout << "Hello GL World" << std::endl;
    glutInit(&argc, argv);
    return 0;
}
You MUST put a copy of glut32.dll in the same folder as the project's .exe, or a runtime error will continue to occur, though for a different reason (a missing DLL rather than the atexit problem).
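For anyone wondering why that define helps: on Windows, glut.h (GLUT 3.7) silently redirects glutInit() to a variant that hands the application's exit() to glut32.dll unless GLUT_DISABLE_ATEXIT_HACK is defined. Roughly, paraphrased from the header (the exact text varies by version):

#if defined(_WIN32) && !defined(GLUT_DISABLE_ATEXIT_HACK)
/* The DLL is given a pointer to the caller's exit(); if the .exe
   and glut32.dll were built against different C runtimes, calling
   it can crash at startup. */
extern void APIENTRY __glutInitWithExit(int *argcp, char **argv,
                                        void (__cdecl *exitfunc)(int));
static void APIENTRY glutInit_ATEXIT_HACK(int *argcp, char **argv)
{
    __glutInitWithExit(argcp, argv, exit);
}
#define glutInit glutInit_ATEXIT_HACK
#endif

Defining the macro makes glutInit() resolve to the plain DLL export instead.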
Here is the dummy-argument version of the same program. Again, you have to add the #define GLUT_DISABLE_ATEXIT_HACK BEFORE #include <GL/glut.h>:
#include <iostream>

#define GLUT_DISABLE_ATEXIT_HACK
#include <GL/glut.h>

int main()
{
    // Dummy arguments so glutInit() has something to parse.
    int argc = 1;
    char *argv[1] = {(char *)"Something"};
    glutInit(&argc, argv);
    ...
}
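For completeness, here is a minimal sketch that actually opens a window with the same fix applied. Only standard GLUT calls are used; the window title and size are arbitrary choices:

#define GLUT_DISABLE_ATEXIT_HACK
#include <GL/glut.h>

// GLUT requires a display callback before entering the main loop.
static void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main()
{
    int argc = 1;
    char *argv[1] = {(char *)"Something"};
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(640, 480);        // arbitrary size
    glutCreateWindow("Hello GL World");  // arbitrary title
    glutDisplayFunc(display);
    glutMainLoop();                      // never returns
    return 0;
}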