glBindFragDataLocation segmentation fault

I’ve got all of my code up to GL3+ compliance now, I think, except I’m still using gl_FragData in the fragment shader, because calling glBindFragDataLocation causes a segmentation fault. Also, I’m using GLEW, and although ‘glewinfo | grep glBindFragDataLocation’ says I’m good for both glBindFragDataLocation and glBindFragDataLocationEXT, when I do a ‘printf("%ld", (long)glBindFragDataLocation);’ I get null.

So… help?

This is the area of code around my problem:
GLuint myShaderProgram;
myShaderProgram = glCreateProgram();
glAttachShader(myShaderProgram, myVertShader);
glAttachShader(myShaderProgram, myFragShader);

glBindAttribLocation(myShaderProgram, 0, "inVertex");
glBindAttribLocation(myShaderProgram, 1, "inColor");
glBindFragDataLocation(myShaderProgram, 0, "outColor"); //Comment this out, and it works. (with gl_FragData[0]).
printf("%ld",(long)glBindFragDataLocationEXT);//0!?

glLinkProgram(myShaderProgram);
glUseProgram(myShaderProgram);

and this is my fragment shader:
#version 150 core

in vec3 exColor;
out vec4 outColor;

void main() {
    vec3 tempC = exColor;
    tempC.r = tempC.r * 0.5;
    tempC.g = tempC.g * 0.5;
    tempC.b = tempC.b * 0.5;
    gl_FragData[0] = vec4(tempC, 1.0); // legacy path, used while glBindFragDataLocation is commented out
    outColor = vec4(tempC, 1.0);       // GL3 path, bound via glBindFragDataLocation
}

Kinda new to this… so any help is appreciated!

#version 150 core
So, I assume it’s a G80+ with drivers 190.18.05 for Linux?
I’m with 190.57 for winxp, and glBindFragDataLocation is present (as in “not null”).

P.S. But I don’t use GLEW.

How are you getting a gl3 context? Are you using freeglut 2.6 or latest SDL? or other method?

Why are you checking for the version with EXT appended


printf("glBindFragDataLocationEXT %ld",(long)glBindFragDataLocationEXT);

but actually calling glBindFragDataLocation without the EXT appended?

I am on Linux with a GeForce 9600 GT and OpenGL version 3.2.0 NVIDIA 190.18.05. I run code using glBindFragDataLocation without any errors. Note that instead of GLEW I take an alternative approach – GLEW is not quite up to supporting all of the GL 3.2 features, although looking at their webpage there was a post about a version that is available, but not as a stable release yet. I use gl3.h instead of GLEW for this reason. First I make sure I have the latest gl3.h headers:


wget http://www.opengl.org/registry/api/glext.h
wget http://www.opengl.org/registry/api/glxext.h
wget http://www.opengl.org/registry/api/gl3.h

sudo cp glext.h /usr/include/GL/
sudo cp glxext.h /usr/include/GL/
sudo mkdir -p /usr/include/GL3   # GL3/ usually doesn't exist yet
sudo cp gl3.h /usr/include/GL3/

Then in my code I remove all glew and do instead


#define GL3_PROTOTYPES
#include <GL3/gl3.h>
// this replaces GL/gl.h; read gl3.h itself to see how to use it properly
// and compile with: g++ foo.cpp -lGL

Then you have the GL3 functions available on Linux.

Thanks for the replies.

I’m running x64 9.10 Kubuntu Linux with the latest updates, kernel and all. I have a 9800GTX+ with the latest (190.42, http://www.nvidia.com/object/linux_display_amd64_190.42.html ) x64 Linux drivers.

I’m getting my GL3 forward-compatible context via GLFW lite 2.6, latest off of SVN, using glfwOpenWindowHint or whatever (not at my computer atm). GL_VERSION does seem to follow these hints, and shows 3.2 at the moment.

Why are you checking for the version with EXT appended

I had been trying both and forgot to change it back before posting. Both give me a segmentation fault, and both are null when cast and printf’d. Which is why I’m getting the segmentation fault, I would guess: trying to call a null function pointer?

I also have the latest GLEW. I guess it’s screwing up? I was thinking about getting away from GLEW and using gl3.h plus GLFW’s multi-platform GetProcAddress stuff.

My main issue there is that I don’t know how to use gl3.h with GLFW, since glfw.h includes gl.h for me. I guess I could replace the includes in the GLFW source with gl3.h and recompile it?
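Recompiling GLFW shouldn’t be necessary. One sketch that avoids patching the source, under the assumption that the system’s GL/gl.h uses the conventional __gl_h_ include guard (Mesa’s header does; check yours): include gl3.h first, then define that guard yourself so glfw.h’s internal #include <GL/gl.h> becomes a no-op.

```c
/* Sketch only - assumes GL/gl.h is guarded by __gl_h_ on this system. */
#define GL3_PROTOTYPES
#include <GL3/gl3.h>   /* real GL3 declarations first */
#define __gl_h_        /* makes glfw.h's #include <GL/gl.h> a no-op */
#include <GL/glfw.h>
```

If the guard macro differs on your platform, open the local GL/gl.h and copy whatever symbol it checks at the top.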

Thanks again.

Found the problem: GLEW wasn’t setting the function pointer for some reason. I pulled it in on my own and it works.

Think I’ll remove glew and just pull in all the functions I need manually.

For some reason though, I forgot that I was still only running an OpenGL 3.1 context. I can change it to 3.2 with the hint, and it shows up, but I get a black render. I thought it was due to using deprecated functions, but I’m pretty sure all of my code is 3.2+ compliant now… and it still doesn’t work… :stuck_out_tongue:

Anyway, sure I’ll figure it out.

Thanks again