Problem getting VAOs to work

I created a 3.0 context and I’m trying to set up VAOs in the place of my old VBOs, but I get a memory error, so I guess I’m doing something wrong. Here’s my new code:

        Gl.glGenVertexArrays(1, out VAOadress);
        Gl.glBindVertexArray(VAOadress);

        Gl.glGenBuffers(1, out VBOadress);
        Gl.glBindBuffer(Gl.GL_ARRAY_BUFFER, VBOadress);

        Gl.glBufferData(Gl.GL_ARRAY_BUFFER, (IntPtr)getSize(), null, Gl.GL_STATIC_DRAW);
        // (inserts data into VBO here)
        Gl.glVertexPointer(3, Gl.GL_FLOAT, 64, (IntPtr)0);
        Gl.glColorPointer(4, Gl.GL_FLOAT, 64, (IntPtr)48);
        Gl.glEnableVertexAttribArray(0);

        Gl.glBindVertexArray(0);

and then in the rendering function:

Gl.glBindVertexArray(VAOadress);
Gl.glDrawArrays(Gl.GL_TRIANGLES, 0, 3);

        Gl.glBindVertexArray(0);

I get this memory error on glDrawArrays during rendering, so I suppose I’m doing something wrong when setting up the vertex pointers? Thanks in advance ;)

(edit: I just wanted to add that the pointers worked correctly in a 2.1 context and with VBOs)

I’m sorry for the double post, but I managed to get past the memory error. Switching glVertexPointer and glColorPointer to glVertexAttribPointer seems to have done the trick. I still can’t get any rendering done, but at least I don’t get a memory error now. I’d still appreciate any help, since I have no idea what’s wrong:

Here’s my viewport setting up function:


Gl.glClear(Gl.GL_COLOR_BUFFER_BIT | Gl.GL_DEPTH_BUFFER_BIT);
Gl.glViewport(0, 0, 800, 600);

Here’s my shader setup function. I cut the parts dealing with GLSL compiling and linking errors. Compiling and linking seem to work all right (I get the expected errors when I mess my shaders up, and no errors when I don’t), but I’m not sure about the attribute location binding.


programObject = Gl.glCreateProgramObjectARB();
vertexShader = Gl.glCreateShaderObjectARB(Gl.GL_VERTEX_SHADER_ARB);
fragmentShader = Gl.glCreateShaderObjectARB(Gl.GL_FRAGMENT_SHADER_ARB);

loadVertexShader();
loadFragmentShader(); //Those functions load vertex and fragment shader sources into vertexShaderSource and fragmentShaderSource

Gl.glShaderSourceARB(vertexShader, 1, new string[1] { vertexShaderSource }, new int[1] { vertexShaderSource.Length });
Gl.glShaderSourceARB(fragmentShader, 1, new string[1] { fragmentShaderSource }, new int[1] { fragmentShaderSource.Length });

Gl.glCompileShaderARB(vertexShader);
Gl.glCompileShaderARB(fragmentShader);

Gl.glAttachObjectARB(programObject, vertexShader);
Gl.glAttachObjectARB(programObject, fragmentShader);

Gl.glDeleteObjectARB(vertexShader);   // only flags for deletion; the shaders stay alive while attached
Gl.glDeleteObjectARB(fragmentShader);

Gl.glBindAttribLocation(programObject, 0, "Position"); // must happen before linking to take effect
Gl.glBindAttribLocation(programObject, 1, "Color");

Gl.glLinkProgramARB(programObject);

Here are my shaders. Vertex shader:


#version 140

in vec3 Position;
in vec4 Color;
out vec4 color;

void main() {
    vec4 pos = vec4(Position.xyz, 1.0);
    gl_Position = pos + vec4(0.0, 0.0, -5.0, 0.0);
    color = Color;
}

Fragment shader:


#version 140

in vec4 color;
out vec4 fragment_color;

void main() {
    fragment_color = color;
}

Here’s my data setup function. Again, the data is correct, since it worked in 2.1, but I’m not sure about the attribute pointers. I simply changed Gl.glVertexPointer and Gl.glColorPointer to Gl.glVertexAttribPointer with the corresponding indices, 0 and 1 (the Position and Color attributes in the shaders are bound to those indices).


Gl.glGenVertexArrays(1, out VAOadress);
Gl.glBindVertexArray(VAOadress);

Gl.glGenBuffers(1, out VBOadress);
Gl.glBindBuffer(Gl.GL_ARRAY_BUFFER, VBOadress);

Gl.glBufferData(Gl.GL_ARRAY_BUFFER, (IntPtr)getSize(), null, Gl.GL_STATIC_DRAW);
// One vertex = 16 floats (64 bytes): position in floats 0-2, color in floats 12-15 (byte offset 48)
Gl.glBufferSubData(Gl.GL_ARRAY_BUFFER, (IntPtr)0, (IntPtr)64, new float[16] { 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0 });
Gl.glBufferSubData(Gl.GL_ARRAY_BUFFER, (IntPtr)64, (IntPtr)64, new float[16] { 1, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0 });
Gl.glBufferSubData(Gl.GL_ARRAY_BUFFER, (IntPtr)128, (IntPtr)64, new float[16] { -1, -1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0 });

Gl.glEnableVertexAttribArray(0);
Gl.glVertexAttribPointer(0, 3, Gl.GL_FLOAT, false, 64, (IntPtr)0);

Gl.glEnableVertexAttribArray(1);
Gl.glVertexAttribPointer(1, 4, Gl.GL_FLOAT, false, 64, (IntPtr)48);

Gl.glBindVertexArray(0);

And last but not least (:P), my rendering function:


Gl.glUseProgramObjectARB(programObject);
Gl.glClear(Gl.GL_COLOR_BUFFER_BIT | Gl.GL_DEPTH_BUFFER_BIT);

Gl.glBindVertexArray(VAOadress);
Gl.glEnableVertexAttribArray(0);
Gl.glEnableVertexAttribArray(1);

Gl.glDrawArrays(Gl.GL_TRIANGLES, 0, 3);

Gl.glDisableVertexAttribArray(0);
Gl.glDisableVertexAttribArray(1);
Gl.glBindVertexArray(0);

Gl.glUseProgramObjectARB(0);

VAOs are not an alternative to VBOs.

While both deal with vertex attribute and index arrays…
VBOs capture the contents.
VAOs capture the bindings and enables (IIRC).
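
To put it in code (a minimal sketch in the same C# bindings; myVao, myVbo, vertexData and dataSize are hypothetical names, not from the code above):

Gl.glGenBuffers(1, out myVbo);
Gl.glBindBuffer(Gl.GL_ARRAY_BUFFER, myVbo);
Gl.glBufferData(Gl.GL_ARRAY_BUFFER, (IntPtr)dataSize, vertexData, Gl.GL_STATIC_DRAW); // the VBO stores the actual bytes

Gl.glGenVertexArrays(1, out myVao);
Gl.glBindVertexArray(myVao);
Gl.glVertexAttribPointer(0, 3, Gl.GL_FLOAT, false, 64, (IntPtr)0); // VAO records source buffer, format, stride, offset
Gl.glEnableVertexAttribArray(0);                                   // VAO records the enable flag
Gl.glBindVertexArray(0);

Gl.glBindVertexArray(myVao); // one bind restores all of that attribute state
Gl.glDrawArrays(Gl.GL_TRIANGLES, 0, 3);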

I’m sorry, it was a poor (wrong :P) choice of words. As you can see in the code I provided, I’m not really trying to use VAOs instead of VBOs, but as an addition to them, the way I saw it done in several tutorials.

Hm, I don’t see what’s wrong in your code…
Do you see a black screen? Try changing the clear color, maybe.

Also, you don’t need to call enable/disable for the attributes every frame. Once you’ve bound the vertex attribute pointers and enabled them, that state stays recorded in the VAO, so you can remove those calls from the drawing routine.
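
For example (a sketch based on the setup code above, where both attributes were enabled while the VAO was bound), the drawing routine can shrink to:

Gl.glUseProgramObjectARB(programObject);
Gl.glBindVertexArray(VAOadress); // the enables recorded in the VAO come back with this bind
Gl.glDrawArrays(Gl.GL_TRIANGLES, 0, 3);
Gl.glBindVertexArray(0);
Gl.glUseProgramObjectARB(0);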

Does it work using only VBOs? If not, your problem is not related to VAOs, and something is wrong in the data, offsets, etc. you have given to OpenGL.
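
For instance, a minimal VBO-only test (a sketch reusing your identifiers; it assumes the context still has a valid default vertex array object 0, i.e. it is not forward-compatible) could look like:

Gl.glBindVertexArray(0); // make sure no VAO is bound
Gl.glBindBuffer(Gl.GL_ARRAY_BUFFER, VBOadress);
Gl.glEnableVertexAttribArray(0);
Gl.glVertexAttribPointer(0, 3, Gl.GL_FLOAT, false, 64, (IntPtr)0);
Gl.glEnableVertexAttribArray(1);
Gl.glVertexAttribPointer(1, 4, Gl.GL_FLOAT, false, 64, (IntPtr)48);
Gl.glDrawArrays(Gl.GL_TRIANGLES, 0, 3);

If that triangle shows up, the problem is in the VAO setup; if not, it is in the data or the shaders.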

@DmitryM - Changing the clear color changes the screen’s color, but I still can’t get anything to render.
I changed my rendering function to not include those enables/disables, but nothing changed.

@dletozeun - I tried doing it without VAOs and still can’t get it to work, so it’s probably a problem with the attribute pointers or the shaders. The shaders compile and link correctly, however. Is there a way to check what data they receive? Some feedback or something? I had it working in 2.1 (slightly different code, but the same meaning), but under 3.0-3.1 I simply cannot. I began to think it was because Tao doesn’t support 3.0-3.1 (I modified it myself to create the bindings and contexts, but I could have done it wrong), so I ported the project to OpenTK, but I still can’t get anything to render, so the error probably lies somewhere in my code.

There are examples of GL-3 rendering in OpenTK (search their site). They should be similar to what you are doing.
Assuming they work correctly you can find the difference that contains a mistake.
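
As a quick sanity check (assuming your bindings expose GL 2.0’s glGetAttribLocation), you can also ask the linked program where it actually put each attribute; -1 means it was misnamed or optimized out:

int posLoc = Gl.glGetAttribLocation(programObject, "Position"); // expect 0
int colLoc = Gl.glGetAttribLocation(programObject, "Color");    // expect 1
Console.WriteLine("Position = " + posLoc + ", Color = " + colLoc);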

Thanks for the help, everybody, I finally found the reason. Once I changed
“vec4 pos = vec4(Position.xyz, 1.0);” to
“vec4 pos = vec4(Position, 1.0);”
in my vertex shader, it turned out the rest works correctly. I took this line from Zeoverlord’s tutorial (http://www.flashbang.se/index.php?id=62), and have no idea why it doesn’t work.
I also noticed it doesn’t render vertices if their z coordinate is >1 or <-1, but I guess this is normal without matrices.

Well sure. Your vertex shader code:

vec4 pos = vec4(Position.xyz, 1.0);
gl_Position = pos + vec4(0.0, 0.0, -5.0, 0.0);

gl_Position is the clip-space position. Divide by w and you get NDC. Anything outside -1…1 in X, Y, or Z in NDC will be clipped. See http://www.songho.ca/opengl/gl_transform.html.

In your case, clip-space == NDC space, because your w == 1 (i.e. a no-op divide-by-w).

Bottom line: with your w == 1, any gl_Position outside the -1…1 cube will be clipped. That means your gl_Position must be in (-1,-1,-1)…(1,1,1).

With the -5 offset in Z in your shader, that means any pos not in (-1,-1,4)…(1,1,6) will be clipped.
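
To make it concrete with the data you posted: those vertices all have z = 0, so the shader outputs gl_Position = (x, y, -5, 1); with w == 1 that is an NDC z of -5, far outside -1…1, so the whole triangle is clipped.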

Does this explain the behavior you’re seeing? Your change of Position.xyz to Position shouldn’t have made any difference.

Thanks, this definitely explains the -1 to 1 z limit.
Changing Position.xyz to Position still makes a difference, though. I get no error either way, but with correct coordinates I get nothing rendered with ‘.xyz’ and everything renders correctly without it (I just double-checked).

I’m using an ATI Radeon 3870 with Catalyst 9.9 drivers.
I also noticed that while it compiles shaders with #version 140, it still requires me to set precision, although I read that it shouldn’t.
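
(For reference, the workaround is a single line near the top of each shader; desktop GLSL 1.40 defines default precisions, so a conformant driver shouldn’t actually demand this:)

#version 140
precision highp float; // redundant on desktop GL, but satisfies drivers that insist on it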