gl3w & glut

Hi,

I’m having some trouble using gl3w and GLUT in the same program.

Here is an example of a simple program using GLUT and gl3w (very close to the one on the gl3w website).


#include <stdio.h>
#include <GL3/gl3w.h>
#include <GL/glut.h>

void reshape(int w, int h)
{

}

void display()
{

	glutSwapBuffers();
}

int main(int argc, char **argv)
{
	int width = 1024, height = 512;
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
	glutInitWindowSize(width, height);
	glutCreateWindow("cookie");

	glutReshapeFunc(reshape);
	glutDisplayFunc(display);

	if (gl3wInit()) {
		fprintf(stderr, "failed to initialize OpenGL\n");
		return -1;
	}
	if (!gl3wIsSupported(3, 2)) {
		fprintf(stderr, "OpenGL 3.2 not supported\n");
		return -1;
	}
	printf("OpenGL %s, GLSL %s\n", glGetString(GL_VERSION),
	       glGetString(GL_SHADING_LANGUAGE_VERSION));

	// ...

	glutMainLoop();
	return 0;
}

Compilation and linking are fine, but I get a segmentation fault in glutSwapBuffers() when I try to run it. I think it comes from the fact that gl3w redefines/reloads function addresses at runtime, and glut gets confused. But I’m not sure. What is your opinion?
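
For what it’s worth, here is a minimal sketch (my own illustration, not gl3w’s actual source) of what I mean by gl3w loading function addresses at runtime. Every gl* symbol it exposes is really a function pointer fetched from the driver, and those pointers are only valid once a context is current:


/* Sketch only: a hypothetical loader in the spirit of gl3w */
#include <GL/glx.h>   /* glXGetProcAddress */

typedef void (*PFNMYGLCLEAR)(GLbitfield mask);
static PFNMYGLCLEAR my_glClear; /* stand-in for gl3w's internal pointers */

static void load_gl_pointers(void)
{
	/* Only meaningful after an OpenGL context has been created and made
	 * current, which is why gl3wInit() is called after glutCreateWindow(). */
	my_glClear = (PFNMYGLCLEAR)glXGetProcAddress((const GLubyte *)"glClear");
}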

BTW, isn’t it a paradox to use glut and gl3w at the same time? I mean, glut uses glVertex and so forth to implement functions like glutWireSphere, but gl3w loads the OpenGL 3.2 core API, which removes all those functions.

I forgot to mention: I use freeglut under Ubuntu 9.10 & 10.04.

I’ve just read this on GameDev:

I have learned the OpenGL core profile for several weeks. I use freeglut (codes from SVN), GLEW 1.5.4, GLM 0.9Beta2 on OpenSUSE 11.2 with Nvidia GTX260. After some hard work, everything is OK. It is necessary to call "glutInitContextVersion(3,3); glutInitContextProfile(GLUT_CORE_PROFILE); …

So, following that, I’ve updated the code to:


#include <stdio.h>
#include <GL3/gl3w.h>
#include <GL/glut.h>
#include <GL/freeglut_ext.h>

void reshape(int w, int h)
{
	glViewport(0, 0, w, h);
}

void display()
{
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glutSwapBuffers();
}

int main(int argc, char **argv)
{
	int width = 1024, height = 512;
	glutInit(&argc, argv);
	glutInitContextVersion(3,2); 
	glutInitContextProfile(GLUT_CORE_PROFILE);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
	glutInitWindowSize(width, height);
	glutCreateWindow("cookie");

	glutReshapeFunc(reshape);
	glutDisplayFunc(display);

	if (gl3wInit()) {
		fprintf(stderr, "failed to initialize OpenGL\n");
		return -1;
	}
	if (!gl3wIsSupported(3, 2)) {
		fprintf(stderr, "OpenGL 3.2 not supported\n");
		return -1;
	}
	printf("OpenGL %s, GLSL %s\n", glGetString(GL_VERSION),
	       glGetString(GL_SHADING_LANGUAGE_VERSION));

	glutMainLoop();
	return 0;
}

But it still doesn’t work!

I tried gl3w.c just now, and yes, it compiles but fails with a segfault as you say. It just seems broken to me.

Anyhow, gl3w seems redundant on Ubuntu: just manually get gl3.h and put it in your /usr/include/GL3/ folder. I do this every now and then with the simple command lines below (they also update the “ext” headers at the same time; why not, it’s easy to do):


wget http://www.opengl.org/registry/api/glext.h
wget http://www.opengl.org/registry/api/glxext.h
wget http://www.opengl.org/registry/api/gl3.h

sudo cp glext.h /usr/include/GL/
sudo cp glxext.h /usr/include/GL/
sudo cp gl3.h /usr/include/GL3/

Then on my Ubuntu 10 machine your code simplifies, compiles with “g++ main.c -lGL -lglut”, and works as:


#include <stdio.h>
// Ensure we are using opengl's core profile only
#define GL3_PROTOTYPES 1
#include <GL3/gl3.h>
#include <GL/glut.h>
#include <GL/freeglut_ext.h>

void reshape(int w, int h)
{
	glViewport(0, 0, w, h);
}

void display()
{
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glutSwapBuffers();
}

int main(int argc, char **argv)
{
	int width = 1024, height = 512;
	glutInit(&argc, argv);

	glutInitContextVersion(3,2); 
	glutInitContextProfile(GLUT_CORE_PROFILE);

	glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
	glutInitWindowSize(width, height);
	glutCreateWindow("cookie");

	printf("OpenGL %s, GLSL %s
", glGetString(GL_VERSION),
		   glGetString(GL_SHADING_LANGUAGE_VERSION));

	glutReshapeFunc(reshape);
	glutDisplayFunc(display);

	glutMainLoop();
	return 0;
}


which pops up a window and reports “OpenGL 3.2.0 NVIDIA 195.36.24, GLSL 1.50 NVIDIA via Cg compiler”.

PS: for alternatives, taking a look at GLEW or GLEE may be helpful. Last I looked, though, only GLEW was at >3.2, while GLEE was at 3.0.
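
In case it helps, here’s a rough sketch from memory (a sketch, not tested on your setup) of how the same freeglut window could be initialized with GLEW instead of gl3w; link with -lGLEW in addition to -lGL -lglut:


#include <stdio.h>
#include <GL/glew.h>          /* must be included before the other GL headers */
#include <GL/glut.h>
#include <GL/freeglut_ext.h>

void display(void)
{
	glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	glutSwapBuffers();
}

int main(int argc, char **argv)
{
	glutInit(&argc, argv);
	glutInitContextVersion(3, 2);
	glutInitContextProfile(GLUT_CORE_PROFILE);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
	glutCreateWindow("glew test");
	glutDisplayFunc(display);

	glewExperimental = GL_TRUE;   /* lets GLEW resolve core-profile entry points */
	GLenum err = glewInit();
	if (err != GLEW_OK) {
		fprintf(stderr, "failed to initialize GLEW: %s\n", glewGetErrorString(err));
		return -1;
	}
	printf("OpenGL %s, GLSL %s\n", glGetString(GL_VERSION),
	       glGetString(GL_SHADING_LANGUAGE_VERSION));

	glutMainLoop();
	return 0;
}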

Hey, cool! I haven’t tried it yet, but I will do it tomorrow. Anyway, thank you!

BTW, do you have any idea why there is a conflict with gl3w?

GLEW is at 4.0, but I still have odd problems with it; I wonder if they come from SDL…

How good is freeglut’s support for OpenGL 3 and above?

Thanks Groovounet, I wasn’t sure where GLEW was at (GL 4). Having said that, I have fallen behind and my GPU is at 3.2, so I haven’t tried any of the GL4 stuff yet. Maybe it’s time to get a new video card.

Back to the original post: if it helps to get GL3 up and running, I’ve added below two “hello world” programs that push some GL3 triangles and work on my machine. The first uses freeglut, the second uses SDL if you feel like trying it out. You should see a blue rectangle.


#include <stdio.h>
// Ensure we are using opengl's core profile only
#define GL3_PROTOTYPES 1
#include <GL3/gl3.h>
#include <GL/glut.h>
#include <GL/freeglut_ext.h>

size_t VertexArrayCount; // 3 for {x y z}
GLuint vao;

void initGL() {
  //Create shaders and shader program
  GLuint vshader(glCreateShader(GL_VERTEX_SHADER));
  GLuint fshader(glCreateShader(GL_FRAGMENT_SHADER));
  GLuint program(glCreateProgram());

  const GLchar *vshader_source[] = 
  {
  "#version 150 core
"
  "
"
  "in vec3 vert;
"
  "
"
  "void main() {
"
  "  gl_Position=vec4(vert,1.);
"
  "}
"
  "
"
  };
  glShaderSource(vshader,1,vshader_source,NULL);

  const GLchar *fshader_source[] = 
  {
  "#version 150 core
"
  "out vec4 fragcolor;
"
  "
"
  "void main() {
"
  "
"
  "  fragcolor=vec4(0.0f,0.0f,1.0f,0.0f);
"
  "}
"
  "
"
  };
  glShaderSource(fshader,1,fshader_source,NULL);

  glCompileShader(vshader);
  glCompileShader(fshader);

  glAttachShader(program,vshader);
  glAttachShader(program,fshader);
  glLinkProgram(program);
  glUseProgram(program);

  //Get handles to shader uniforms
  //... none for this simple vert/frag shader

  //Data destined for video memory; it can be a local array (glBufferData copies it, so it may go out of scope afterwards).
  #define R 0.9
  GLfloat vertices[] = { // in vec3 vert;
    -R,  R, 0.0, // xyz 
    -R, -R, 0.0, 
     R,  R, 0.0,
     R, -R, 0.0
   };
   VertexArrayCount=sizeof(vertices)/sizeof(GLfloat)/3; // 3 for {x y z}

  //Create geometry vertex array using Model definition
  //use global GLuint vao;
  glGenVertexArrays(1,&vao);
  glBindVertexArray(vao);

  //in vec3 vert;
  GLuint bon_vert; // buffer object name
  glGenBuffers(1,&bon_vert);
  glBindBuffer(GL_ARRAY_BUFFER,bon_vert);
  glBufferData(GL_ARRAY_BUFFER,sizeof(GLfloat)*3*VertexArrayCount,vertices,GL_STATIC_DRAW);
  const GLint loc_vert(glGetAttribLocation(program,"vert"));
  glVertexAttribPointer(loc_vert,3,GL_FLOAT,GL_TRUE,0,NULL);
  glEnableVertexAttribArray(loc_vert);
}

void reshape(int w, int h)
{
  glViewport(0, 0, w, h);
}

void display()
{
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

  glBindVertexArray(vao);
  glDrawArrays(GL_TRIANGLE_STRIP,0,VertexArrayCount);

  glutSwapBuffers();
}

int main(int argc, char **argv)
{
  glutInit(&argc, argv);

  glutInitContextVersion(3,2); 
  glutInitContextProfile(GLUT_CORE_PROFILE);

  glutInitDisplayMode(GLUT_RGBA | GLUT_DEPTH | GLUT_DOUBLE);
  glutCreateWindow("gl3.3 freeglut");

  printf("OpenGL %s, GLSL %s
", glGetString(GL_VERSION),
       glGetString(GL_SHADING_LANGUAGE_VERSION));

  initGL();
  glutReshapeFunc(reshape);
  glutDisplayFunc(display);

  glutMainLoop();
  return 0;
}


#define GL3_PROTOTYPES
#include <GL3/gl3.h> // wget http://www.opengl.org/registry/api/gl3.h
#include <SDL.h>

int main() 
{
  //SDL Initialization
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION,3);
  SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION,2);
  SDL_Init(SDL_INIT_VIDEO);
  const int r_width(640),r_height(640);
  SDL_WindowID window(SDL_CreateWindow("openGL3 HelloWorld",SDL_WINDOWPOS_CENTERED,SDL_WINDOWPOS_CENTERED,r_width,r_height,SDL_WINDOW_OPENGL|SDL_WINDOW_SHOWN));
  SDL_GLContext glcontext(SDL_GL_CreateContext(window));

  //Create shaders and shader program
  GLuint vshader(glCreateShader(GL_VERTEX_SHADER));
  GLuint fshader(glCreateShader(GL_FRAGMENT_SHADER));
  GLuint program(glCreateProgram());

  const GLchar *vshader_source[] = 
  {
  "#version 150 core
"
  "
"
  "in vec3 vert;
"
  "
"
  "void main() {
"
  "  gl_Position=vec4(vert,1.);
"
  "}
"
  "
"
  };
  glShaderSource(vshader,1,vshader_source,NULL);

  const GLchar *fshader_source[] = 
  {
  "#version 150 core
"
  "out vec4 fragcolor;
"
  "
"
  "void main() {
"
  "
"
  "  fragcolor=vec4(0.0f,0.0f,1.0f,0.0f);
"
  "}
"
  "
"
  };
  glShaderSource(fshader,1,fshader_source,NULL);

  glCompileShader(vshader);
  glCompileShader(fshader);

  glAttachShader(program,vshader);
  glAttachShader(program,fshader);
  glLinkProgram(program);
  glUseProgram(program);

  //Get handles to shader uniforms
  //... none for this simple vert/frag shader

  //Data destined for video memory; it can be a local array (glBufferData copies it, so it may go out of scope afterwards).
  #define R 0.9
  GLfloat vertices[] = { // in vec3 vert;
    -R,  R, 0.0, // xyz 
    -R, -R, 0.0, 
     R,  R, 0.0,
     R, -R, 0.0
   };
   size_t VertexArrayCount=sizeof(vertices)/sizeof(GLfloat)/3; // 3 for {x y z}

  //Create geometry vertex array using Model definition
  GLuint vao;
  glGenVertexArrays(1,&vao);
  glBindVertexArray(vao);

  //in vec3 vert;
  GLuint bon_vert; // buffer object name
  glGenBuffers(1,&bon_vert);
  glBindBuffer(GL_ARRAY_BUFFER,bon_vert);
  glBufferData(GL_ARRAY_BUFFER,sizeof(GLfloat)*3*VertexArrayCount,vertices,GL_STATIC_DRAW);
  const GLint loc_vert(glGetAttribLocation(program,"vert"));
  glVertexAttribPointer(loc_vert,3,GL_FLOAT,GL_TRUE,0,NULL);
  glEnableVertexAttribArray(loc_vert);

  //Render loop
  SDL_Event event;
  do {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBindVertexArray(vao);
    glDrawArrays(GL_TRIANGLE_STRIP,0,VertexArrayCount);

    SDL_GL_SwapWindow(window);

    SDL_PollEvent(&event); //non-blocking
  } while (event.type!=SDL_MOUSEBUTTONDOWN);

  //Quitting, so clean up resources
  glDeleteProgram(program);
  glDeleteShader(fshader);
  glDeleteShader(vshader);
  glDeleteBuffers(1,&bon_vert);
  glDeleteVertexArrays(1,&vao);
  SDL_GL_DeleteContext(glcontext);
  SDL_DestroyWindow(window);
  SDL_Quit();
}

The problem with gl3w under Linux is now fixed. Sorry for the inconvenience.

The latest release also supports building and using gl3w as a shared library. Check it out at https://github.com/skaslev/gl3w