Issue with VBOs

Hey Everyone,

I am working on an edge buffer to draw silhouettes and artistic lines on meshes. I've set up my data structure for differentiating between the different line types, and now I'm trying to draw the lines. I started out with something simple: draw the points, using glPointSize( 10.0 ) to increase their size so I can see whether they are being drawn in the right place.

I attach an EdgeBuffer object to an object being drawn in the scene, and I set up a simple shader for the lines as follows:

Vertex Shader:

#version 430 core

uniform mat4 modelview;
uniform mat4 projection;

layout (location = 0) in vec3 vertex;
layout (location = 1) in vec3 normal;

//out vec3 V;
//out vec3 N;
//out vec3 ObjN;

void main(void)
{
	//V = vertex;
	
	// create the Normal Matrix to correct Normal into camera space
	//mat3 normalMatrix = transpose(inverse(mat3(modelview)));
	//N = normalize( normalMatrix * normal );
	//ObjN = normal;
	
    gl_Position = projection * modelview * vec4( vertex, 1.0 );
}

Fragment Shader:

#version 430 core

out vec4 color;

void main(void)
{    
    color = vec4( 0.0, 0.0, 0.0, 1.0 );    // opaque black; vec4( 0.0 ) alone would give zero alpha
}

This should simply move the vertex into the proper position and color it black. I tested this shader on a simple world-coordinates reference that draws a line along each of the x, y and z axes; the lines showed up black, so everything was good so far.

For simplicity's sake, I created a separate vertex array, vertex buffer, normal buffer and index buffer for the EdgeBuffer class instead of latching onto the object's vertices and vertex array. The index buffer I set to GL_DYNAMIC_DRAW, since I plan on changing it multiple times during the draw stage; I only want to draw certain lines in a certain way. This is how I generate my data for OpenGL in my edge buffer:


//Constructor
EdgeBuffer::EdgeBuffer( const vector< vec3 >& pVerts, const vector< vec3 >& pNormals )
{
	// Gen Vertex Array
	glGenVertexArrays( 1, &m_iVertexArray );
	glBindVertexArray( m_iVertexArray );

	// Vertices Buffer
	glGenBuffers( 1, &m_iVerticesBuffer );
	glBindBuffer( GL_ARRAY_BUFFER, m_iVerticesBuffer );
	glBufferData( GL_ARRAY_BUFFER, pVerts.size() * sizeof( glm::vec3 ), pVerts.data(), GL_STATIC_DRAW );	
	glVertexAttribPointer( 0, 3, GL_FLOAT, GL_FALSE, 0, NULL );
	glEnableVertexAttribArray( 0 );

	// Normals Buffer
	glGenBuffers( 1, &m_iNormalsBuffer);
	glBindBuffer( GL_ARRAY_BUFFER, m_iNormalsBuffer );
	glBufferData( GL_ARRAY_BUFFER, pNormals.size() * sizeof( glm::vec3 ), pNormals.data(), GL_STATIC_DRAW );	
	glVertexAttribPointer( 1, 3, GL_FLOAT, GL_FALSE, 0, NULL );
	glEnableVertexAttribArray( 1 );

	// Indices Buffer
	glGenBuffers( 1, &m_iIndicesBuffer);
	glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, m_iIndicesBuffer );
	glBufferData( GL_ELEMENT_ARRAY_BUFFER, m_vDrawIndices.size() * sizeof( uvec2 ), m_vDrawIndices.data(), GL_DYNAMIC_DRAW );
	// Note: m_vDrawIndices is empty at this point, so this allocates a zero-sized buffer.
	// I wasn't sure if I should still use it to initialize the buffer data,
	// or just initialize the data with NULL.

	glBindVertexArray( 0 );
}
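
As an aside, regarding that comment about the empty index vector: one pattern I've seen is to allocate the element buffer once with a NULL data pointer and a fixed capacity, then stream the current indices into it with glBufferSubData each frame instead of reallocating with glBufferData. A rough sketch of the idea (kMaxEdgeIndices is a made-up capacity, not something my class actually has):

	// In the constructor: reserve GL_DYNAMIC_DRAW storage once; a NULL data
	// pointer means "allocate the space but don't fill it".
	const GLsizeiptr kMaxEdgeIndices = 65536;   // hypothetical upper bound
	glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, m_iIndicesBuffer );
	glBufferData( GL_ELEMENT_ARRAY_BUFFER, kMaxEdgeIndices * sizeof( uvec2 ),
	              NULL, GL_DYNAMIC_DRAW );

	// Per frame: overwrite the front of the buffer with the current index list.
	glBufferSubData( GL_ELEMENT_ARRAY_BUFFER, 0,
	                 m_vDrawIndices.size() * sizeof( uvec2 ),
	                 m_vDrawIndices.data() );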

Here’s where I draw my edgebuffer:


void EdgeBuffer::drawEdgeBuffer()
{
	// Get current bindings as a restore point
	GLint iCurrProgBinding = 0, iCurrVABinding = 0;
	glGetIntegerv( GL_VERTEX_ARRAY_BINDING, &iCurrVABinding );
	glGetIntegerv( GL_CURRENT_PROGRAM, &iCurrProgBinding );

	// Bind Edge Buffer data
	glBindVertexArray( m_iVertexArray );
	// My Shader Manager is a Singleton and keeps reference to each shader. 
	// This call returns the program value for my edge shader.
	glUseProgram( ShaderManager::getInstance()->getProgram( ShaderManager::eShaderType::EDGE_SHDR ) );

	// Fetch Indices
	// This will go through the Edge Buffer data structure and set up m_vDrawIndices with the proper indices to draw.
	// m_vDrawIndices is a vector< uvec2 > where each element is an index to 2 vertices to draw a line between.
	findIndices( OUTSIDE_EDGE_FLAG );
	
	// Upload new Indices
	glBindBuffer( GL_ELEMENT_ARRAY_BUFFER, m_iIndicesBuffer );
	glBufferData( GL_ELEMENT_ARRAY_BUFFER, m_vDrawIndices.size() * sizeof( uvec2 ), m_vDrawIndices.data(), GL_DYNAMIC_DRAW );

	// Set Uniform - Since glLineWidth is deprecated,
	// I am setting up a geometry shader to create a line mesh using this uniform.
	// I'm including this since I don't know what is wrong with my code.
	ShaderManager::getInstance()->setUniformFloat( ShaderManager::eShaderType::EDGE_SHDR, "fWidth", 1000.f );

	glPointSize( 10.0 );
	//glDrawArrays( GL_POINTS, 0, m_vEdgeIndices.size() );
	glDrawElements( GL_POINTS, m_vDrawIndices.size() * 2, GL_UNSIGNED_INT, NULL );
	glPointSize( 1.0 );

	// Restore Bindings
	glUseProgram( iCurrProgBinding );
	glBindVertexArray( iCurrVABinding );
}
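
For context on that fWidth uniform: the geometry shader I'm working toward expands each line segment into a screen-aligned quad, since wide lines can't be relied on in core profile. Something along these lines, as a sketch rather than my final shader (the vViewport uniform is an assumption; my code doesn't pass it in yet):

#version 430 core

layout( lines ) in;
layout( triangle_strip, max_vertices = 4 ) out;

uniform float fWidth;     // desired line width in pixels
uniform vec2  vViewport;  // viewport size in pixels (assumed uniform)

void main(void)
{
	// Work in NDC so the width is consistent on screen.
	vec4 p0 = gl_in[0].gl_Position;
	vec4 p1 = gl_in[1].gl_Position;
	vec2 ndc0 = p0.xy / p0.w;
	vec2 ndc1 = p1.xy / p1.w;

	// Offset perpendicular to the segment; fWidth / vViewport converts a
	// half-width of fWidth/2 pixels on each side into NDC units.
	vec2 dir    = normalize( ndc1 - ndc0 );
	vec2 offset = vec2( -dir.y, dir.x ) * fWidth / vViewport;

	gl_Position = vec4( ( ndc0 + offset ) * p0.w, p0.zw ); EmitVertex();
	gl_Position = vec4( ( ndc0 - offset ) * p0.w, p0.zw ); EmitVertex();
	gl_Position = vec4( ( ndc1 + offset ) * p1.w, p1.zw ); EmitVertex();
	gl_Position = vec4( ( ndc1 - offset ) * p1.w, p1.zw ); EmitVertex();
	EndPrimitive();
}

Once that's in place, the draw call above would use GL_LINES instead of GL_POINTS.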

This draw function is called after the object draws. I do this in order to draw the lines on top of the object.

This is what I get: Example 1 and Example 2. (I wasn't able to embed the images in the post for some reason, so I uploaded them to my git repo.)

In both examples, the scene is simply a plane with a point light source and the world-coordinate lines mentioned earlier. In the first example, a point shows up at the very top right-hand corner of the window. In the second, the point is gone. The disappearance of the point is expected from the edge buffer algorithm: when the camera views the plane from the top, the edge is classified as front facing, but from the bottom it's classified as back facing.
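
For clarity, the front/back classification I mean is the usual facing test against the eye. A sketch of the idea (not my exact code; vCameraPos and the per-face data are stand-ins):

	// A face is front facing when its normal points toward the eye.
	bool isFrontFacing( const vec3& vFaceCenter, const vec3& vFaceNormal,
	                    const vec3& vCameraPos )
	{
		vec3 vToEye = vCameraPos - vFaceCenter;
		return glm::dot( vFaceNormal, vToEye ) > 0.0f;
	}

	// An edge is a silhouette edge when its two adjacent faces disagree:
	// one front facing, one back facing.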

The issue is, I don't know why it's drawing a point like that. The point is also white, which doesn't match my fragment shader, which is supposed to color it black.

Here’s what I’ve tried:

- I thought maybe it was using the plane's shader for some reason; I changed the plane's color to black in that shader, but I was still getting the white point.
- I tried using a different shader for the drawEdgeBuffer function; same issue.
- I tried using glDrawArrays instead; this drew a point at both top corners of the window, regardless of camera position.
- I was initially trying to hijack the vertex array of the object creating the edge buffer; the plan was to create only my own index buffer and, with the object's vertex array bound, use that index buffer to address the already-bound vertices. I rewrote it to create my own vertex array, thinking that could be the issue.

My main thought is that it's drawing in screen space for some reason, but I don't see what's causing the issue, since I use the same technique for drawing a bunny mesh and it shows up fine.
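
In case it helps, this is the kind of state check I've been adding around the draw call while debugging (just a sketch; it assumes the project's GL loader header is already included):

	#include <cstdio>

	// Dump the bindings that matter for the draw, plus any pending GL errors.
	static void debugDumpDrawState( const char* sWhere )
	{
		GLint iProg = 0, iVAO = 0, iEBO = 0;
		glGetIntegerv( GL_CURRENT_PROGRAM,              &iProg );
		glGetIntegerv( GL_VERTEX_ARRAY_BINDING,         &iVAO );
		glGetIntegerv( GL_ELEMENT_ARRAY_BUFFER_BINDING, &iEBO );
		printf( "[%s] program=%d vao=%d ebo=%d\n", sWhere, iProg, iVAO, iEBO );

		for ( GLenum e = glGetError(); e != GL_NO_ERROR; e = glGetError() )
			printf( "[%s] GL error 0x%04X\n", sWhere, e );
	}

Calling debugDumpDrawState( "before draw" ) right before glDrawElements shows whether the program and vertex array I expect are actually bound.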

Any help or insight would be greatly appreciated, thank you!

I figured it out!

Turns out the problem was in this call:

 ...
	// Set Uniform - Since glLineWidth is deprecated,
	// I am setting up a geometry shader to create a line mesh using this uniform.
	// I'm including this since I don't know what is wrong with my code.
	ShaderManager::getInstance()->setUniformFloat( ShaderManager::eShaderType::EDGE_SHDR, "fWidth", 1000.f );
...

Inside setUniformFloat, I bind the program to set the uniform, but then bind the program back to 0 before exiting the function. Because I call this after binding my edge program in drawEdgeBuffer, the program is unbound when I go to draw. One of those frustrating mistakes!
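
For anyone who hits the same thing: since this is a GL 4.3 context, one way to sidestep the problem entirely is glProgramUniform1f (core since 4.1), which sets a uniform on a specific program without touching the current binding. A sketch of what setUniformFloat could do instead (it assumes the existing getProgram accessor; everything else is placeholder):

	// Sets a float uniform without disturbing GL_CURRENT_PROGRAM.
	void ShaderManager::setUniformFloat( eShaderType eType, const char* sName, float fVal )
	{
		GLuint iProgram  = getProgram( eType );
		GLint  iLocation = glGetUniformLocation( iProgram, sName );
		if ( iLocation >= 0 )
			glProgramUniform1f( iProgram, iLocation, fVal );    // no bind/unbind needed
	}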

Thanks!

Hey, great job, I’m glad I’m not the only one who posts, and then a little bit later, posts that he solved his own problem.

Keep it going!

Jeff