Shaders and vertex arrays

I’m trying hard to figure out shaders, and I’ve been reading this tutorial, which seems pretty good and easy to understand. I’ve followed its code as closely as I can, and while I do see a triangle, it’s all solid red rather than the range of colors I specified in my color array. Here’s my actual code:
In the init section:

            float[] vertices = {
                                -0.8f, -0.8f, 0.0f, 1.0f,
                                0.0f, 0.8f, 0.0f, 1.0f,
                                0.8f, -0.8f, 0.0f, 1.0f
                                };

            float[] colors = {
                                1.0f, 0.0f, 0.0f, 1.0f,
                                0.0f, 1.0f, 0.0f, 1.0f,
                                0.0f, 0.0f, 1.0f, 1.0f
                             };

            gl.GenVertexArrays(1, vaoId);
            gl.BindVertexArray(vaoId[0]);

            gl.GenBuffers(1, vertexBufferId);
            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, vertexBufferId[0]);
            unsafe
            {
                fixed (float* verts = vertices)
                {
                    var ptr = new IntPtr(verts);
                    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, vertices.Length * sizeof(float), ptr, OpenGL.GL_STATIC_DRAW);
                }
            }

            gl.VertexAttribPointer(0, 3, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
            gl.EnableVertexAttribArray(0);

            gl.GenBuffers(1, colorBufferId);
            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);
            unsafe
            {
                fixed (float* colr = colors)
                {
                    var ptr = new IntPtr(colr);
                    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, colors.Length * sizeof(float), ptr, OpenGL.GL_STATIC_DRAW);
                }
            }

            gl.VertexAttribPointer(1, 3, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
            gl.EnableVertexAttribArray(1);

            
            vertexShaderId = gl.CreateShader(OpenGL.GL_VERTEX_SHADER);
            fragmentShaderId = gl.CreateShader(OpenGL.GL_FRAGMENT_SHADER);

            string[] vertString = File.ReadAllLines("vertShader.vert");
            string[] fragString = File.ReadAllLines("fragShader.frag");
            unsafe
            {
                gl.ShaderSource(vertexShaderId, 1, vertString, null);
                gl.ShaderSource(fragmentShaderId, 1, fragString, null);
            }

            gl.CompileShader(vertexShaderId);
            gl.CompileShader(fragmentShaderId);

            programId = gl.CreateProgram();

            gl.AttachShader(programId, vertexShaderId);
            gl.AttachShader(programId, fragmentShaderId);

            gl.LinkProgram(programId);
            gl.UseProgram(programId);
            

            gl.ClearColor(0, 0, 0, 0);

And in the Draw section:

            gl.Clear(OpenGL.GL_COLOR_BUFFER_BIT | OpenGL.GL_DEPTH_BUFFER_BIT);

            //  Load the identity matrix.
            gl.LoadIdentity();

            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, vertexBufferId[0]);

            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);

            gl.DrawArrays(OpenGL.GL_TRIANGLES, 0, 3);

This is vertShader.vert:

#version 400

layout(location=0) in vec4 in_Position;
layout(location=1) in vec4 in_Color;
out vec4 ex_Color;

void main(void)
{
	gl_Position = in_Position;
	ex_Color = in_Color;
}

and fragShader.frag:

#version 400

in vec4 ex_Color;
out vec4 out_Color;

void main(void)
{
	out_Color = ex_Color;
}

I apologize for the C#-ness of the code (I’m using SharpGL as a wrapper), but it should still be understandable. As far as I can tell I’m passing everything correctly, but I’ve never done this before, so I wouldn’t know if I’m making some elementary error. What’s wrong with my setup?

First, check for errors when you compile your shaders:

http://www.opengl.org/wiki/Common_Mistakes#Checking_For_Errors_When_You_Compile_Your_Shader

Also, you should set glShadeModel to GL_SMOOTH.
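In SharpGL that’s just:

gl.ShadeModel(OpenGL.GL_SMOOTH);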

Besides that, your render function doesn’t make sense. You are supposed to use the VAO. This is not good:


gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, vertexBufferId[0]);
gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);
gl.DrawArrays(OpenGL.GL_TRIANGLES, 0, 3);

This is the way you bind a VAO and render. Don’t bind VBOs.


gl.BindVertexArray(vaoId[0]);
gl.DrawArrays(OpenGL.GL_TRIANGLES, 0, 3);

Thanks for the link. I added code to check for errors after compiling shaders:

uint errorCheckValue = gl.GetError();
if (errorCheckValue != OpenGL.GL_NO_ERROR)
{
    Application.Exit();  // Got a bit lazy there ;)  but it gets the message across.
}

The application stayed open, so there doesn’t seem to be a problem compiling the shaders.

The tutorial I was looking at binds both the VBOs and the VAO. Why does it do that if only the VAO needs to be bound?

The initialize section runs once before the draw section, and the VAO is bound there and never unbound anywhere else, so I wouldn’t expect that to be a problem. Still, I bound it in the Draw function as well, just to make sure, but unfortunately it didn’t seem to make a difference. However, when I unbound the VBOs by putting gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, 0); right before binding the VAO again in the Draw function, I got a white screen and the application froze until I stopped debugging.

However, I think I’ve found the problem :), though I’m not sure how to solve it :(. Even though the VAO is bound after the VBOs with gl.BindVertexArray(vaoId[0]); and I have the two gl.VertexAttribPointer calls and the two gl.EnableVertexAttribArray calls right between that and gl.DrawArrays, I still get the white screen and freeze if no VBO is bound.

If the VBO with vertex coordinates is bound after the one with color values, I get a solid red triangle with vertex locations matching the vertices array. If the VBO with color values is bound last, I get a solid red triangle whose vertex locations match the colors array instead. It seems to be using the last bound VBO for vertex positions, regardless of the gl.VertexAttribPointer and gl.EnableVertexAttribArray calls. It might even be ignoring my shaders entirely, though I don’t know how to check that.

How can I get it into the program’s head that the colors array is for colors and the vertices array is for vertex positions?

EDIT - I read a bit more about glVertexAttribPointer and fixed an issue where the size parameter was 3 when it should be 4. Still, I’m having the same problem of it reading the last loaded VBO’s data as position values. :(

A shader failing to compile doesn’t generate an OpenGL error (so that FAQ isn’t much help here); you need to check the compile status using:

GLint status;
glGetShaderiv(shader, GL_COMPILE_STATUS, &status);

And the program link status using:

GLint status;
glGetProgramiv(program, GL_LINK_STATUS, &status);
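A SharpGL-flavoured sketch of the same checks, including pulling the info log so you can read the actual compiler message, might look roughly like this. gl.GetShader and gl.GetProgram match the calls used elsewhere in this thread; the GetShaderInfoLog/GetProgramInfoLog overloads taking a StringBuilder are an assumption about the wrapper, so check the signatures in your SharpGL version:

// Sketch only. Needs "using System.Text;" for StringBuilder.
int[] status = new int[1];

gl.GetShader(vertexShaderId, OpenGL.GL_COMPILE_STATUS, status);
if (status[0] != OpenGL.GL_TRUE)
{
    var log = new StringBuilder(2048);
    gl.GetShaderInfoLog(vertexShaderId, 2048, IntPtr.Zero, log);   // assumed overload
    Console.WriteLine("Vertex shader compile error: " + log);
}

gl.GetProgram(programId, OpenGL.GL_LINK_STATUS, status);
if (status[0] != OpenGL.GL_TRUE)
{
    var log = new StringBuilder(2048);
    gl.GetProgramInfoLog(programId, 2048, IntPtr.Zero, log);       // assumed overload
    Console.WriteLine("Program link error: " + log);
}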

Thanks, Dan. I’ve replaced the error-checking code with

int[] compileStatus = new int[1];
gl.GetShader(vertexShaderId, OpenGL.GL_COMPILE_STATUS, compileStatus);
if (compileStatus[0] != OpenGL.GL_TRUE)
{
    Application.Exit();
}

for each of the two shaders right after their compile calls, and a similar version for the link call. The application still doesn’t close, so I assume there’s no error there.

EDIT - I removed the gl.VertexAttribPointer and gl.EnableVertexAttribArray calls in the Draw function. The problem was that they caused OpenGL to look for position and color data in the last bound buffer, which is why the triangle changed depending on which buffer was bound, and why it froze when no buffer was bound.
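To illustrate what I mean, here’s a sketch (reusing the identifiers from my init code): gl.VertexAttribPointer reads from whichever buffer is bound to GL_ARRAY_BUFFER at the moment it is called, and the VAO records that association.

gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, vertexBufferId[0]);
// Attribute 0 is now sourced from vertexBufferId, because that is the buffer
// bound to GL_ARRAY_BUFFER when VertexAttribPointer is called.
gl.VertexAttribPointer(0, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
gl.EnableVertexAttribArray(0);

gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);
// Attribute 1 is sourced from colorBufferId. Repeating these calls in Draw
// with some other buffer (or no buffer) bound repoints both attributes at
// that buffer, which is what my old Draw code was doing.
gl.VertexAttribPointer(1, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
gl.EnableVertexAttribArray(1);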

However, it’s still solid red. I suspect something’s wrong with my fragment shader. Is it outputting the right variable? How does the fragment shader know what to do with out_Color? Do I have to bind out_Color somehow?

My init code is now:

            gl.ShadeModel(OpenGL.GL_SMOOTH);

            float[] vertices = {
                                -0.8f, -0.8f, 0.0f, 1.0f,
                                0.0f, 0.8f, 0.0f, 1.0f,
                                0.8f, -0.8f, 0.0f, 1.0f
                                };

            float[] colors = {
                                1.0f, 0.0f, 0.0f, 1.0f,
                                0.0f, 1.0f, 0.0f, 1.0f,
                                0.0f, 0.0f, 1.0f, 1.0f
                             };

            gl.GenVertexArrays(1, vaoId);
            gl.BindVertexArray(vaoId[0]);

            gl.GenBuffers(1, vertexBufferId);
            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, vertexBufferId[0]);
            unsafe
            {
                fixed (float* verts = vertices)
                {
                    var ptr = new IntPtr(verts);
                    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, vertices.Length * sizeof(float), ptr, OpenGL.GL_STATIC_DRAW);
                }
            }

            gl.VertexAttribPointer(0, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
            gl.EnableVertexAttribArray(0);

            gl.GenBuffers(1, colorBufferId);
            gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);
            unsafe
            {
                fixed (float* colr = colors)
                {
                    var ptr = new IntPtr(colr);
                    gl.BufferData(OpenGL.GL_ARRAY_BUFFER, colors.Length * sizeof(float), ptr, OpenGL.GL_STATIC_DRAW);
                }
            }

            gl.VertexAttribPointer(1, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
            gl.EnableVertexAttribArray(1);

            
            vertexShaderId = gl.CreateShader(OpenGL.GL_VERTEX_SHADER);
            fragmentShaderId = gl.CreateShader(OpenGL.GL_FRAGMENT_SHADER);

            string[] vertString = File.ReadAllLines("vertShader.vert");
            string[] fragString = File.ReadAllLines("fragShader.frag");
            unsafe
            {
                gl.ShaderSource(vertexShaderId, vertString.Length, vertString, null);
                gl.ShaderSource(fragmentShaderId, fragString.Length, fragString, null);
            }

            gl.CompileShader(vertexShaderId);

            int[] compileStatus = new int[1];
            gl.GetShader(vertexShaderId, OpenGL.GL_COMPILE_STATUS, compileStatus);
            if (compileStatus[0] != OpenGL.GL_TRUE)
            {
                Application.Exit();
            }

            gl.CompileShader(fragmentShaderId);

            gl.GetShader(fragmentShaderId, OpenGL.GL_COMPILE_STATUS, compileStatus);
            if (compileStatus[0] != OpenGL.GL_TRUE)
            {
                Application.Exit();
            }
            

            programId = gl.CreateProgram();

            gl.AttachShader(programId, vertexShaderId);
            gl.AttachShader(programId, fragmentShaderId);

            gl.LinkProgram(programId);
            int[] linkStatus = new int[1];
            gl.GetProgram(programId, OpenGL.GL_LINK_STATUS, linkStatus);
            if (linkStatus[0] != OpenGL.GL_TRUE)
            {
                Application.Exit();
            }

            gl.UseProgram(programId);

My Draw code is simply:

gl.Clear(OpenGL.GL_COLOR_BUFFER_BIT | OpenGL.GL_DEPTH_BUFFER_BIT);
gl.LoadIdentity();
gl.BindVertexArray(vaoId[0]);
gl.DrawArrays(OpenGL.GL_TRIANGLES, 0, 3);

And my vertex shader is:

#version 400

layout(location=0) in vec4 in_Position;
layout(location=1) in vec4 in_Color;
out vec4 ex_Color;

void main(void)
{
	gl_Position = in_Position;
	ex_Color = in_Color;
}

And fragment shader:

#version 400

in vec4 ex_Color;
out vec4 out_Color;

void main(void)
{
	out_Color = ex_Color;
}

I’m also learning OpenGL, but I can see something missing in the above code.

In your vertex shader you are expecting

layout(location=0) in vec4 in_Position;
layout(location=1) in vec4 in_Color;

but in your code you only called glEnableVertexAttribArray(0);

I guess you should also call glEnableVertexAttribArray(1); and tell OpenGL where and how to read the color data. I mean, you need to call glVertexAttribPointer with the proper values, so that the color information for each vertex is passed correctly to the vertex shader and then on to the fragment shader.

I guess this is why your shape is not in the color you wanted.

good luck!

I do enable the vertex attrib array, though, right after I buffer the color data:

gl.GenBuffers(1, colorBufferId);
gl.BindBuffer(OpenGL.GL_ARRAY_BUFFER, colorBufferId[0]);
unsafe
{
    fixed (float* colr = colors)
    {
        var ptr = new IntPtr(colr);
        gl.BufferData(OpenGL.GL_ARRAY_BUFFER, colors.Length * sizeof(float), ptr, OpenGL.GL_STATIC_DRAW);
    }
}

gl.VertexAttribPointer(1, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));
gl.EnableVertexAttribArray(1);

I tried switching to GLSL version 1.30 and using gl_FragColor, but that didn’t change anything. Am I missing some kind of GLSL driver?

You can leave array 1 disabled. Your in_Color will get some default value, probably 0, 0, 0, 1.
As for your vertices, you seem to be using xyzw but you passed a value of 3 to glVertexAttribPointer.
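That is, with four floats per position the pointer call would need a size of 4, e.g. (sketch only, using the same SharpGL call style as the code above):

gl.VertexAttribPointer(0, 4, OpenGL.GL_FLOAT, false, 0, new IntPtr(0));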

V-man wrote:
"You can leave array 1 disabled. Your in_Color will get some default value, probably 0, 0, 0, 1. As for your vertices, you seem to be using xyzw but you passed a value of 3 to glVertexAttribPointer."

Hmmmm… this is interesting. Thanks, V-man.

I don’t see where I’m passing 3 to glVertexAttribPointer; both of the calls right after buffering the VBO data pass 4 as the size.

Anyway, the vertex positions seem to be working properly. It’s the color that isn’t working. I’m using RGBA and passing a size of 4, but the shape is still all solid red, no matter the value of out_Color in the fragment shader. I set out_Color to a solid blue once, just to test, and still got red. There’s something wrong with my shader, or maybe with the way I pass data to it, but for the life of me I can’t figure out what.