glDrawArrays triggers Access Violation

Hi,

after spending quite some time understanding how the OpenGL 3.2 basics work, I now wanted to display a first polygon on the screen.
So I tried to build this tutorial, but I’m having some trouble.

The following code is taken almost completely from the above tutorial. The only differences are that I’m not using SDL but SFML to create the rendering context, that I reordered the commands a bit, and that I’m using my own shader and program classes to encapsulate the raw GL commands. Additionally, I removed the loop at the end, since it has nothing to do with my problem.

    // Define vertices and colors
    const GLfloat diamond[4][2] = {
        {  0.0,  1.0  }, /* Top point */
        {  1.0,  0.0  }, /* Right point */
        {  0.0, -1.0  }, /* Bottom point */
        { -1.0,  0.0  } }; /* Left point */

    const GLfloat colors[4][3] = {
        {  1.0,  0.0,  0.0  }, /* Red */
        {  0.0,  1.0,  0.0  }, /* Green */
        {  0.0,  0.0,  1.0  }, /* Blue */
        {  1.0,  1.0,  1.0  } }; /* White */

        // Bind attributes       
        glBindAttribLocation(TestProgram.GetProgramObject(), 0, "in_Position");
        glBindAttribLocation(TestProgram.GetProgramObject(), 1, "in_Color");

        // Link Program
        TestProgram.Link();
        TestProgram.Use();
               
               
        GLuint vao, vbo[2];
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);

        glGenBuffers(2, vbo);

        glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
        glBufferData(GL_ARRAY_BUFFER, 8 * sizeof(GLfloat), diamond, GL_STATIC_DRAW);
        glVertexAttribPointer((GLuint)0, 2, GL_FLOAT, GL_FALSE, 0, 0);
        glEnableVertexAttribArray(0);

        glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
        glBufferData(GL_ARRAY_BUFFER, 12 * sizeof(GLfloat), colors, GL_STATIC_DRAW);
        glVertexAttribPointer((GLuint)1, 3, GL_FLOAT, GL_FALSE, 0, 0);
        glEnableVertexAttribArray(1);

        while (App.IsOpened()) {
            sf::Event Event;

            // Handle events
            while (App.GetEvent(Event)) {
                /* ... */
            }
           
            glClearColor(0.0, 0.0, 0.0, 1.0);
            glClear(GL_COLOR_BUFFER_BIT);

            glDrawArrays(GL_LINE_LOOP, 0, 4); // <-- Here, the problem arises
           
            App.Display();
        }

The shaders used are taken almost completely from the tutorial.

As the thread’s title suggests, the problem is an Access Violation at runtime during the glDrawArrays call.
Reducing the count parameter of glDrawArrays(GLenum mode, GLint first, GLsizei count) to 1 makes the Access Violation go away. However, nothing is drawn then, of course.

Shaders and programs are compiled and linked successfully.

I hope someone here can help me, so that I eventually see a first polygon rendered. =)

Thanks and greetings,
OGLBeginner

A quick guess is that position arrays with size < 3 or > 4 are not supported. Try changing your line positions to be 3-dimensional, with the Z dimension just being 0 or -1.

In fact, you may be better off using 4 dimensions for positions anyway; just set the 4th dimension to 1.0 as well. If it’s fixed functionality, you can be sure that the fixed vertex shader is requesting a vec4 for position. Same with colours; those should be 4-dimensional, with the alpha set to 1.0.

It’s not always best to pack your data as tightly as possible. :slight_smile: There are benefits to using 4-dimensional floats for everything. I’m not 100% sure on the specifics, but it has to do with how SIMD instructions are optimized. I’m also not 100% sure whether that really applies to what you’re doing, but it’s good practice.

Are you using custom GLSL shaders?

It worked! :smiley:
However, I don’t quite understand why it didn’t before…

Are you using custom GLSL shaders?

I’m basically using the shaders provided by the tutorial.

Here’s the vertex shader

//#version 150
// in_Position was bound to attribute index 0 and in_Color was bound to attribute index 1
in vec4 in_Position; //<-- I just changed that from vec2 to vec4  to match the incoming data
in vec3 in_Color;

// We output the ex_Color variable to the next shader in the chain
out vec3 ex_Color;
void main(void) {
    // Since we are using flat lines, our input only has two components: x and y.
    // Set the Z coordinate to 0 and the W coordinate to 1

    //gl_Position = vec4(in_Position.x, in_Position.y, 0.0, 1.0);
    gl_Position = in_Position;
    // GLSL allows shorthand use of vectors too, the following is also valid:
    // gl_Position = vec4(in_Position, 0.0, 1.0);
    // We're simply passing the color through unmodified

    ex_Color = in_Color;
}

I guess I’ll have to experiment a bit, why it didn’t work before.

Anyways, thanks for your help, finally a first line loop! xD
As I said, I was basically using the tutorial code, so if you find some problems there, I’d be glad if you let me know.

Greetings,
OGLBeginner

A quick guess is that position arrays with size < 3 or > 4 are not supported.

That makes no sense. OpenGL neither knows nor cares whether something is a “position array.” It’s just an attribute. So if that fixed it, then there’s a driver bug.

What hardware are you using?

Good question. I only looked it over quickly and didn’t see that my questions were already answered in your code. :wink: Glad that switching to a vec4 works now; it will be interesting to figure out why it wasn’t working before.

What hardware are you using?

AMD Phenom II X4 920
ATI HD4870 1GB

My Catalyst Version is 10.3.

Anyways, I experimented a bit and had some interesting results:

  1. It doesn’t matter whether shaders and programs are even used. I observed that when I had a small typo in the vertex shader, resulting in no code being generated (at least the info log said so), a picture was still being drawn.

  2. Apparently the shaders don’t compile because of the #version directive. Commenting out this directive (which I did before) makes the code compile, but of course not against the correct version; it falls back to version 1.1. Could that be the source of the problem?

This leads to the question of why it doesn’t compile with the “#version 150” directive. The shader info log says:
“Vertex shader failed to compile with the following errors:
ERROR: 1:1: error<#76> Syntax error unexpected tokens following #version
ERROR: error<#273> 1 compilation errors. No code generated”

The same of course for the fragment shader.

Any idea why this happens?

Greetings,
OGLBeginner

AFAIK, if it still draws when the shaders are disabled, it means it’s falling back on the fixed-functionality pipeline…

Thanks, this explains observation 1.
Still, the question is why the “#version 150” directive prevents compilation…

Greetings,
OGLBeginner

I guess one cannot rely on a shader’s info log…
I just tried “#version 123” (instead of “#version 150”) and the info log suddenly says: Version number not supported by GL2.

Isn’t that odd? To me, “unexpected tokens following #version” (the error before) sounds like the compiler couldn’t read the version number for some reason, but apparently it actually can. Additionally, why would it say GL2? I’m sure that the context I’m using is set up for OpenGL 3.2; at least querying GL_MAJOR_VERSION and GL_MINOR_VERSION says so.

Is there any way to check which version of GLSL is really used inside the shader? Is there a way to write to the shader’s info log from inside the shader (to write __VERSION__)?

Greetings,
OGLBeginner

Is there any way to check which version of the GLSL is really used inside the shader?

That’s not how it works. Each GLSL version has differences that make some of them incompatible. For example, your shader is a GLSL 1.5 shader and uses 1.5 features that a proper GLSL 1.1 compiler cannot compile. The #version directive is how you tell the compiler which version of GLSL the shader is written in. If you don’t specify a version number, I believe that 1.1 is used.

As for why the “#version 150” isn’t working, maybe it’s the comments after it.

That’s interesting. I removed the comments after #version (and all the others), and now there is a different compilation error.
First, the current vertex shader code:

#version 150

in vec2 in_Position;
in vec3 in_Color;

out vec3 ex_Color;

void main(void) {
    gl_Position = vec4(in_Position.x, in_Position.y, 0.0, 1.0);
    ex_Color = in_Color;
}

The error:
error<#307> Profile ‘in’ is not supported
error<#76> Syntax error unexpected tokens following #version
error<#273> 2 compilation errors. No code generated.

It just seems to skip the version number…
What’s with that?

I believe that 1.1 is used.

That’s what it says in the spec.

Greetings,
OGLBeginner

Do you query the named attribute to get the index target for the attribute bind?

You are supposed to bind these attributes to an index slot with glBindAttribLocation for any given shader, and I see no evidence that you are making this call.

It just seems to skip the version number…

It’s not skipping the version number. It’s really, really confused with the parsing. It thinks that “in” is somehow part of your version.

Are you somehow stripping out the spaces or endlines when you load the file?

I see no evidence you are making this call.

    // Bind attributes
    glBindAttribLocation(TestProgram.GetProgramObject(), 0, "in_Position");
    glBindAttribLocation(TestProgram.GetProgramObject(), 1, "in_Color");

I’m doing that. :wink:

Are you somehow stripping out the spaces or endlines when you load the file?

I guess that’s the problem. Yes, I’m stripping out the endlines (but not the spaces), but the spec says that only a series of null-terminated strings is needed, and according to the C++ reference, string.c_str() points to a null-terminated string.
Anyways, I’m using the following code to load the shaders.

	// Create buffer of null-terminated strings
	vector<string> LineBuffer;
	while (File) {
		string Line;
		getline(File, Line);
		LineBuffer.push_back(Line);
	}
	if (File.bad())
		throw runtime_error("Couldn't read shader");

		
	vector<const GLchar*> CArray;
	vector<string>::const_iterator end(LineBuffer.end()), begin(LineBuffer.begin());
	while (begin != end) {
		CArray.push_back((begin++)->c_str());
	}

	// Set shader source
	glShaderSource(ShaderObject, CArray.size(), &CArray[0], 0);

When I read the source of the shader back with glGetShaderSource, it actually prints “#version 150in”, so I guess that’s really the problem.

Thanks for all the help. If you see the problem with my code to load the shader source, please tell me.
I’ll also try to solve it.

Greetings,
OGLBeginner

PS: If you wonder why I’m using so much code, the answer is that it should be exception safe and not leak resources if an exception is raised.

Anyways, I’m using the following code to load the shaders.

Wow, there are so many things wrong with that. Just do this:


std::ifstream shaderFile(strFilename.c_str());
std::stringstream shaderData;
shaderData << shaderFile.rdbuf();
shaderFile.close();
const std::string &theShader = shaderData.str();

Where “strFilename” is the filename as a std::string. The shader string is in the variable named “theShader”.

The null-terminated strings are not meant to be the lines of a single file; each null-terminated string is supposed to be a whole file. So you would just pass an array of one.

If you use Mercurial, I’m trying to develop some tutorials on OpenGL. The base code has functions for loading and compiling/linking shaders. You can get the repository here.

each null-terminated string is supposed to be a file.

Yeah, I just figured that out. I never really looked at the tutorial’s code for loading the shader source.

Each of the null-terminated strings is not meant to be lines of a single file; each null-terminated string is supposed to be a file. So you would just pass an array of one.

The spec says actually something different:

The command
void ShaderSource( uint shader, sizei count, const char **string, const int *length );
loads source code into the shader object named shader. string is an array of count pointers to optionally null-terminated character strings that make up the source code. The length argument is an array with the number of chars in each string (the string length). If an element in length is negative, its accompanying string is null-terminated.
If length is NULL, all strings in the string argument are considered null-terminated.

Since there is so little material on OpenGL 3.2 (compared to the previous versions), I tried to go by the spec. And for me, the part about “an array of count pointers” reads like:
glShaderSource(ShaderObject, CArray.size(), &CArray[0], 0); :wink:

Anyways, I’ll try your approach and hopefully this will finally solve the problem.
Thanks for your help, I’ll have a look at your tutorial code!

Greetings,
OGLBeginner

Since there is so little material on OpenGL 3.2 (compared to the previous version)

Shaders have been in OpenGL since before 2.0; that’s a good 5 years or so. The various versions of OpenGL since then have changed nothing about shader loading. There is plenty of information about them. And I’ve never seen loading code like yours.

And for me this bold part reads like:
glShaderSource(ShaderObject, CArray.size(), &CArray[0], 0);

How? It specifically says an array of pointers to strings. Not an array of pointers to characters.

How? It specifically says an array of pointers to strings. Not an array of pointers to characters.

It says “character strings”, which would be something like
GLchar string1[] = {'S', 'T', 'R', '1', '\0'};

And an array of count pointers to these strings, would be what I did, wouldn’t it?

Shaders have been in OpenGL since before 2.0; that’s a good 5 years or so.

I wasn’t specifically talking about shaders, but also about things like vertex buffers, vertex array objects, etc.
There is tons of stuff about the “old stuff” and even the new Red Book still covers fixed functionality.

And I’ve never seen loading code like yours.

Thanks, I guess. xD

Greetings,
OGLBeginner

It says “character strings”, which would be something like
GLchar string1[] = {'S', 'T', 'R', '1', '\0'};

Which is the same as:

const GLchar string1[] = "STR1";

In short, it is a string. Just like any other string.

And an array of count pointers to these strings, would be what I did, wouldn’t it?

No, it wouldn’t. What you gave it is an array of pointers to each character of the same string. What it asked for was an array of pointers to different strings.

An array of pointers to strings (more commonly called an array of strings), would look like this:


const char *strArray[] =
{
  "first string",
  "second string",
  "third string",
};

Or like this:


std::string first("first string");
std::string second("second string");
std::string third("third string");

const char *strArray[3];
strArray[0] = first.c_str();
strArray[1] = second.c_str();
strArray[2] = third.c_str();

Or even this:


const char **strArray = new const char*[3];
strArray[0] = "first string";
strArray[1] = "second string";
strArray[2] = "third string";

You only had one string; the string you loaded from the file. Thus, your array should have been an array of one element in length. “count” is the number of strings in the array, not the number of characters in a string.

You only had one string; the string you loaded from the file. Thus, your array should have been an array of one element in length. “count” is the number of strings in the array, not the number of characters in a string.

Now I’m confused.

Have a look at my code again:

	vector<string> LineBuffer;
	while (File) {
		string Line;
		getline(File, Line);
		LineBuffer.push_back(Line);
	}

LineBuffer is a vector of strings, each of which represents a line in the source file (getline(File, Line);).

	vector<const GLchar*> CArray;
	vector<string>::const_iterator end(LineBuffer.end()), begin(LineBuffer.begin());
	while (begin != end) {
		CArray.push_back((begin++)->c_str());
	}

Here I feed a vector of pointers to const GLchar with the c_str() of each line in LineBuffer. I’m doing this only because the glShaderSource command doesn’t work with std::strings.

	glShaderSource(ShaderObject, CArray.size(), &CArray[0], 0);

And finally, here CArray.size() is the number of lines in the source file, and &CArray[0] is the address of an array of CArray.size() pointers to null-terminated C strings. Or, as it is said in the spec:
“string is an array of count pointers to optionally null-terminated character strings that make up the source code.”

But maybe I’m not seeing the wood for the trees.

Greetings,
OGLBeginner