glGenFramebuffers(EXT) always fails!

Hi,

I have the same problem on Windows 7 and Linux, on ATI and NVIDIA, using a number of different context-creation libraries (GLUT, SDL, GLX, WGL) and up-to-date drivers (I know at least some of those create valid contexts of the correct version, and I remember to call MakeCurrent()). glGenFramebuffers/glGenFramebuffersEXT always fails. Neither function pointer is NULL either; I checked in gdb and with printf. Both entry points exist, as they should on the cards I've tried (Quadro FX 580 and Radeon HD 5450), and glxinfo says the extensions are supported.

And yet when I run this code:

glViewport(0, 0, RD_WINDOW_WIDTH, RD_WINDOW_HEIGHT);

vbo = new unsigned int;
GLuint *ibo = new unsigned int;
GLuint *pixelbuffers = new unsigned int[2];
fbo = new unsigned int;

GLenum err = glewInit();
if (GLEW_OK != err)
{
    /* Problem: glewInit failed, something is seriously wrong. */
    fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
}

glGenBuffers(1, vbo);

glGenBuffers(1, ibo);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, *ibo);
programs = new unsigned int[4];
glBindBuffer(GL_ARRAY_BUFFER, *vbo);
glBufferData(GL_ARRAY_BUFFER, 1024*1024, 0, GL_STATIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);

glGenFramebuffers(1, fbo);

It segfaults at that last line, regardless of where I put it or whether any of the other GL calls are included. If I replace it with the EXT version it still segfaults; the backtrace ends at the method ??, which seems to mean it segfaults somewhere in the driver, since that's about the only relevant library I use that doesn't export debug symbols (but that makes no sense, since it does the same on Windows). If I delete the line, it runs fine through several more OpenGL calls, copying buffers and generating textures, until I call glFramebufferTexture2D or the EXT equivalent, and then gives up there for obvious reasons. I do call glewInit after I create the context. I could find no other advice on Google.

I am at a loss. Any help would be much appreciated.
Regards,
Kevin

fbo=new unsigned int;
glGenFramebuffers(1,fbo);

is a VERY unusual way to retrieve the ID of the new FBO.
Better to do it this way:

GLuint fbo = 0;
glGenFramebuffers(1, &fbo);

I guess you somehow got confused by this pointer mess and let GL write into memory you don't own.

vbo=new unsigned int;
GLuint *ibo= new unsigned int;
GLuint *pixelbuffers=new unsigned int[2];
fbo=new unsigned int;

What is this? Why are you allocating GLuints instead of just putting them on the stack? Unless you have a particular need to allocate memory for them, just use them the same way everyone else does:


GLuint fbo;
glGenFramebuffers(1, &fbo);

Hi,
Thanks. I have tried it the way you suggested; however, it still does not work.

edit: But you sparked an idea in my mind. It might well be heap corruption; I'll go over my allocations with that in mind and then we'll see if it still fails.