Weird segmentation faults when using OpenGL

Hello,

I’m learning to use modern OpenGL functionalities (such as shaders and VBOs) at my university, but I’m running into some problems.

First of all, I’m using Ubuntu, and this is what the command glxinfo | grep version shows:

server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
OpenGL version string: 1.4 Mesa 8.0.2

glxinfo | grep GL_ARB_VERTEX outputs the following:

 GL_ARB_vertex_program, GL_ARB_vertex_shader, GL_ATI_draw_buffers, 
    GL_ARB_texture_non_power_of_two, GL_ARB_vertex_buffer_object, 
    GL_APPLE_object_purgeable, GL_ARB_vertex_array_object, 

so I have the ARB extensions that allow me to use VBOs.
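(For what it’s worth, the same availability check can also be done in code once glewInit() has succeeded, instead of parsing glxinfo by hand; a sketch, assuming GLEW:)

```cpp
// After glewInit() succeeds, GLEW exposes one boolean per extension:
if (!GLEW_ARB_vertex_buffer_object || !GLEW_ARB_vertex_shader) {
    std::cerr << "Required ARB extensions are not supported" << std::endl;
    return -1;
}
```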

Now, the problem: the following code

#include <GL/glew.h>
#include <GL/glut.h>
#include <iostream>

class Vec3{
public:
	float m_x;
	float m_y;
	float m_z;

	Vec3(){}
	Vec3(float x, float y, float z){
		m_x=x;
		m_y=y;
		m_z=z;
	}
};

GLuint VBO;

static void CreateVertexBuffer()
{
	Vec3 vertices[1];
	vertices[0] = Vec3(0.0f, 0.0f, 0.0f);

	glGenBuffersARB(1, &VBO);
	glBindBufferARB(GL_ARRAY_BUFFER, VBO);
	glBufferDataARB(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
}

static void renderScene(){
	glClear(GL_COLOR_BUFFER_BIT);
	glEnableClientState(GL_VERTEX_ARRAY);
	glVertexPointer(3, GL_FLOAT, 0, 0);
	glDrawArrays(GL_POINTS, 0, 1);
	glDisableClientState(GL_VERTEX_ARRAY);
	glutSwapBuffers();
}
int main(int argc, char **argv) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
	glutInitWindowSize(400, 400);
	glutInitWindowPosition(50, 50);
	glutCreateWindow("Tutorial 2");
	glutDisplayFunc(renderScene);
	GLenum err = glewInit();
	if (err != GLEW_OK){
		std::cerr << "Error : " << glewGetErrorString(err) << std::endl;
		return -1;
	}

	glClearColor(1.0f, 0.0f, 0.0f, 0.0f);
	CreateVertexBuffer();
	glutMainLoop();
	return 0;
}

works fine, but if I replace the lines

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_FLOAT, 0, 0);
glDrawArrays(GL_POINTS, 0, 1);
glDisableClientState(GL_VERTEX_ARRAY);

with the following:

glEnableVertexAttribArrayARB(0);
glVertexAttribPointerARB(0, 3, GL_FLOAT, GL_FALSE, 0, 0);
glDrawArrays(GL_POINTS, 0, 1);
glDisableVertexAttribArrayARB(0);

I get a segmentation fault. Does anybody know why?

Did you forget to load the function pointers?

The problem is here:


	glutDisplayFunc(renderScene);
	GLenum err = glewInit();

Once you start giving GLUT callback functions, it is free to call them at any time. Like, for example, immediately. But since you haven’t finished loading the OpenGL functions yet (i.e., you haven’t called glewInit), you can’t call those functions yet.

In short, move the registration of renderScene until after you’ve initialized GLEW.
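A minimal sketch of main() with the registration moved, assuming everything else in the program stays the same:

```cpp
int main(int argc, char **argv) {
	glutInit(&argc, argv);
	glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
	glutInitWindowSize(400, 400);
	glutInitWindowPosition(50, 50);
	glutCreateWindow("Tutorial 2");

	// Load the GL function pointers first...
	GLenum err = glewInit();
	if (err != GLEW_OK){
		std::cerr << "Error : " << glewGetErrorString(err) << std::endl;
		return -1;
	}

	glClearColor(1.0f, 0.0f, 0.0f, 0.0f);
	CreateVertexBuffer();

	// ...and only then register the callback that uses them.
	glutDisplayFunc(renderScene);
	glutMainLoop();
	return 0;
}
```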

How do I do that?

I switched the two lines, and the result is still the same.

Wait, I didn’t see this:

OpenGL version string: 1.4 Mesa 8.0.2

glVertexAttribPointer and the other generic attribute functions weren’t core OpenGL until 2.0; until then, they came from ARB_vertex_shader. Naturally, they also had ARB suffixes.
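On an implementation this old, it’s worth guarding before touching those entry points at all; a hedged sketch, assuming GLEW’s per-extension boolean and function-pointer macros:

```cpp
// Hypothetical guard, after glewInit() has succeeded:
if (!GLEW_ARB_vertex_shader || glVertexAttribPointerARB == NULL) {
    std::cerr << "Generic vertex attributes unavailable" << std::endl;
    exit(EXIT_FAILURE);
}
```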

However, I’d be more concerned about the fact that your hardware seems to be limited to GL 1.4. Are you using some kind of older Intel hardware?

I’m using a Dell Inspiron Mini 10. The lspci -v command indicates in the VGA compatible controller section:

00:02.0 VGA compatible controller: Intel Corporation N10 Family Integrated Graphics Controller (prog-if 00 [VGA controller])
	Subsystem: Dell Device 048e
	Flags: bus master, fast devsel, latency 0, IRQ 43
	Memory at f0200000 (32-bit, non-prefetchable) [size=512K]
	I/O ports at 18d0 [size=8]
	Memory at d0000000 (32-bit, prefetchable) [size=256M]
	Memory at f0000000 (32-bit, non-prefetchable) [size=1M]
	Expansion ROM at <unassigned> [disabled]
	Capabilities: <access denied>
	Kernel driver in use: i915
	Kernel modules: i915

Is it too old to use these functionalities?

My guess is that the segmentation fault happens because, without a vertex shader, the driver still tries to read the data as if it were glVertexPointer data; generic vertex attributes cannot be used with the fixed-function pipeline. Your driver reports the extensions, so the hardware should be able to do this.
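A rough sketch of binding a trivial shader first, using the old ARB object API (compile/link error checking omitted; the attribute name “pos” is just for illustration):

```cpp
const char* vs_src =
	"attribute vec3 pos;\n"
	"void main() { gl_Position = vec4(pos, 1.0); }\n";

GLhandleARB vs = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
glShaderSourceARB(vs, 1, &vs_src, NULL);
glCompileShaderARB(vs);

GLhandleARB prog = glCreateProgramObjectARB();
glAttachObjectARB(prog, vs);
glBindAttribLocationARB(prog, 0, "pos");  // tie generic attribute 0 to "pos", before linking
glLinkProgramARB(prog);
glUseProgramObjectARB(prog);  // now generic attribute 0 has a consumer
```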

According to the specification, your netbook should have GMA 500 graphics. In that case it is GL 2.0 compatible, since it supports SM 3.0 (although poorly).
But your kernel driver says something else:

i915 is an old chipset from Q1/2005 (for laptops). It is SM 2.0 compatible, but with software vertex shader emulation. In that case, it cannot use shaders in GL, and it is really only GL 1.4 compatible (even VBO is not supported). So, try to update the chipset drivers for your netbook (I really doubt Intel uses a 7-year-old chipset for a netbook released this year).

VBO is most likely supported, but you have to use the ARB extension. The only reason that Intel driver reports GL 1.4 and not 1.5 (which is when VBO went core) is that 1.5 requires occlusion queries, which the Intel hardware is incapable of.

If the driver crashes on such simple code, most likely the drivers are buggy. I would suggest making your VBO larger, say 1 KB.

I allocated 1024 bytes of data in glBufferDataARB, and the result is still the same.

Edit:

I ran an experiment: I read the tutorial from Alfonse Reinheart’s signature, took the first example (which creates two shaders and displays a triangle using them), replaced all the necessary functions with their ARB equivalents, and it still doesn’t work.

This is getting really annoying -_-