Okay, so here’s my problem. I’m trying to render points (GL_POINTS) into a 3D texture, but I get GL_INVALID_OPERATION from either glDrawArrays or glDrawArraysInstanced. Here’s what’s happening:
- Create a VBO holding a list of points, with two extra floats per point for texture coordinates used in my shader. That’s 5 floats per point: 3 for position, 2 for texture coordinates.
- Create a 256x256x256 3D texture with internal format GL_RGBA8 to render to.
- Set the viewport to the width and height of each layer of my 3D texture (256x256). On the framebuffer object I’ve already created, I call glFramebufferTexture( GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, texture3D, 0 ) to attach all layers of the texture. I also enable that color attachment as the only draw buffer using glDrawBuffers(…).
- Call glDrawArrays or glDrawArraysInstanced with my VBO bound, using GL_POINTS, and get GL_INVALID_OPERATION from OpenGL. (If I replace this step with glClear, the clear call works and clears my texture.)
- Unbind the framebuffer.
I am using a geometry shader that writes gl_Layer so that each fragment knows which layer of the 3D texture it goes to. The program compiles and links correctly, and it is bound before I call glDrawArrays / glDrawArraysInstanced. The framebuffer is also reported framebuffer-complete.
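For reference, the geometry shader is shaped roughly like this. This is a minimal sketch, not my exact shader: the varying names (vTexCoord, gTexCoord) and the way the layer is chosen are just placeholders.

```glsl
#version 150

layout(points) in;
layout(points, max_vertices = 1) out;

in vec2 vTexCoord[];   // hypothetical varying passed in from the vertex shader
out vec2 gTexCoord;

void main()
{
    gl_Position = gl_in[0].gl_Position;
    gTexCoord   = vTexCoord[0];
    // Route this point to a layer of the 3D texture.
    // The mapping from texcoord to layer here is purely illustrative.
    gl_Layer    = int(vTexCoord[0].y * 255.0);
    EmitVertex();
    EndPrimitive();
}
```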
I really need to be able to output gl_Layer in my geometry program and I just can’t figure out why this is not working.
I’ve checked that my VBO is set up properly, and it seems to be. I’ve uploaded contiguous data where every 20 bytes defines one point to render: the first 12 bytes are the position and the next 8 are the texture coordinates. glVertexAttribPointer is called correctly for both the positions and the texture coordinates (correct start offset and stride), and both vertex attribute arrays are enabled.
I really don’t know what’s going on. Does the overview of the process make sense? Is there something I’m missing there?
This is all done using OpenGL 3.2 with the latest NVIDIA drivers.
Here is a listing of the GL calls:
GLuint vBuff;
glGenBuffers(1, &vBuff);
float vertexData[] = { 0.55, 0.24, 0.123, 0.5, 0.5,
0.57, 0.24, 0.123, 0.51, 0.5,
...... // 400 of these
};
glBindBuffer(GL_ARRAY_BUFFER, vBuff);
glBufferData( GL_ARRAY_BUFFER, 8000, vertexData, GL_DYNAMIC_DRAW);
glVertexAttribPointer( 0, 3, GL_FLOAT, GL_FALSE, 20, (void*)0 );
glEnableVertexAttribArray( 0 );
glVertexAttribPointer( 1, 2, GL_FLOAT, GL_FALSE, 20, (void*)12 );
glEnableVertexAttribArray( 1 );
GLuint texture3D;
glGenTextures(1, &texture3D);
glBindTexture( GL_TEXTURE_3D, texture3D );
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGBA8, 256, 256, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
// framebufferId I already have; I'll skip the creation code because I know it's correct.
// It's bound:
glBindFramebuffer( GL_FRAMEBUFFER, framebufferId );
glFramebufferTexture( GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, texture3D, 0 );
GLenum drawBuffers[] = { GL_COLOR_ATTACHMENT0 };
glDrawBuffers( 1, drawBuffers );
glBindTexture( GL_TEXTURE_3D, 0);
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE );
glViewport( 0, 0, 256, 256 );
// Setup shader program uniforms
// This involves binding some textures setting their uniforms, and setting other uniforms
glDrawArrays( GL_POINTS, 0, 400 );
// or
// glDrawArraysInstanced( GL_POINTS, 0, 400, 2 );
// both give invalid operation