Trouble with shader

I’m having some problems getting my shader to work, and I’d appreciate some guidance.

When I run the code below, the triangles I output are white instead of the color specified in verts at the end of the second code block.

Any help is appreciated.

Also, here is the result of querying the GL strings (vendor, renderer, GL version, and GL_SHADING_LANGUAGE_VERSION):
Intel
Intel(R) HD Graphics Family
3.1.0 - Build 8.15.10.2476
1.40 - Intel Build 8.15.10.2476
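
Those strings are just the glGetString values; roughly how they can be queried, as a sketch (assuming a current GL context and GLEW for the declarations):

#include <GL/glew.h>
#include <iostream>

// Sketch: print the driver strings shown above (requires a current GL context).
void printGLInfo()
{
    std::cout << glGetString(GL_VENDOR) << std::endl;                   // Intel
    std::cout << glGetString(GL_RENDERER) << std::endl;                 // Intel(R) HD Graphics Family
    std::cout << glGetString(GL_VERSION) << std::endl;                  // 3.1.0 - Build ...
    std::cout << glGetString(GL_SHADING_LANGUAGE_VERSION) << std::endl; // 1.40 - ...
}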

This is my shader code.

const char* vertexShaderCode =
    "#version 140\n"
    ""
    "in layout(location=0) vec2 position;"
    "in layout(location=1) vec3 vertexColor;"
    "out vec3 theColor;"
    ""
    "void main()"
    "{"
    "    gl_Position = vec4(position, 0.0, 1.0);"
    "    theColor = vertexColor;"
    "}";

const char* fragmentShaderCode =
    "#version 140\n"
    ""
    "out vec4 daColor;"
    "in vec3 theColor;"
    ""
    "void main()"
    "{"
    "    daColor = vec4(theColor, 1.0);"
    "}";

And here is the code that does the compiling, linking, and such:

void installShaders()
{
    GLuint vertexShaderID = glCreateShader(GL_VERTEX_SHADER);
    GLuint fragmentShaderID = glCreateShader(GL_FRAGMENT_SHADER);

    const char* adapter[1];
    adapter[0] = vertexShaderCode;
    glShaderSource(vertexShaderID, 1, adapter, NULL);
    adapter[0] = fragmentShaderCode;
    glShaderSource(fragmentShaderID, 1, adapter, NULL);

    glCompileShader(vertexShaderID);
    glCompileShader(fragmentShaderID);

    GLuint programID = glCreateProgram();
    glAttachShader(programID, vertexShaderID);
    glAttachShader(programID, fragmentShaderID);

    glLinkProgram(programID);

    glUseProgram(programID);
    std::cout << fragmentShaderCode << std::endl;
    std::cout << fragmentShaderID << std::endl;
    std::cout << programID << std::endl;
}

void sendDataToOpenGL()
{
    GLfloat verts[] =
    {
        +0.0f, +0.0f, // 0
        +1.0f, +0.0f, +0.0f,
        +1.0f, +1.0f, // 1
        +1.0f, +0.0f, +0.0f,
        -1.0f, +1.0f, // 2
        +1.0f, +0.0f, +0.0f,
        -1.0f, -1.0f, // 3
        +1.0f, +0.0f, +0.0f,
        +1.0f, -1.0f, // 4
        +1.0f, +0.0f, +0.0f,
    };

    GLuint vertexBufferID;
    glGenBuffers(1, &vertexBufferID);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBufferID);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, sizeof(float) * 5, 0);
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(float) * 5, (char*)(sizeof(float) * 2));

    GLushort indices[] = { 0,1,2, 0,3,4 };
    GLuint indexBufferID;
    glGenBuffers(1, &indexBufferID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBufferID);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
}

Hello,

You are using explicit attribute locations, which are an OpenGL 3.3 / GLSL 330 feature. You might also get them through an extension on your device. The shader compiler should have given you an error at compile time; you might want to check those messages.
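
If you are not doing so already, a check along these lines will print the compile and link logs (rough sketch, error handling kept minimal):

#include <GL/glew.h>
#include <iostream>

// Sketch: print the info log if a shader failed to compile.
void checkShaderCompiled(GLuint shaderID)
{
    GLint status;
    glGetShaderiv(shaderID, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE)
    {
        GLchar log[1024];
        glGetShaderInfoLog(shaderID, sizeof(log), NULL, log);
        std::cout << "shader compile error: " << log << std::endl;
    }
}

// Sketch: same idea for the link stage.
void checkProgramLinked(GLuint programID)
{
    GLint status;
    glGetProgramiv(programID, GL_LINK_STATUS, &status);
    if (status != GL_TRUE)
    {
        GLchar log[1024];
        glGetProgramInfoLog(programID, sizeof(log), NULL, log);
        std::cout << "program link error: " << log << std::endl;
    }
}

Call checkShaderCompiled() right after each glCompileShader() and checkProgramLinked() right after glLinkProgram() in your installShaders().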

Does that mean I need to change the version in my shader to 330 or higher? It used to be 430 because I was following a tutorial I found online, but I lowered it to 140 when I saw another forum post say it should be lowered to whatever version the Intel build reports. Setting the version to 330, 430, or anything else doesn’t work either; I still get white triangles with no color.

Also, I do not get an error in VS2010. If there is somewhere else I should look for a “shader compiler error”, please tell me where I can find that.

I figured out how to get shader compiler errors.

I didn’t get any errors from my fragment shader. However, the vertex shader compiler returned this error when I ran it with version 330 (it also reported that version 430 is not supported by my OpenGL driver):

“ERROR: 0:2: ‘location’ : syntax error parse error”

When I lowered the version number to 130, I get this error:

“ERROR: 0:2: ‘(’ : syntax error parse error”

I’m not sure what this means. I don’t see a parenthesis out of place anywhere…please, help!

Try:

layout(location=0) in vec2 position;
layout(location=1) in vec3 vertexColor;

Using version 130 will not work at all as layout(location) is a GLSL 330 feature. I’d expect a syntax error in that case.

Malexander, when I tried what you recommended with version 330, the vertex shader compiler shows the following error: “ERROR: 0:2: ‘location’ : syntax error parse error”.

Looking back at your OP, I see that the Intel GL implementation only supports GLSL 1.40. So, you’ll need to do:


#version 140
#extension GL_ARB_explicit_attrib_location : require

layout(location=0) in vec2 position;
layout(location=1) in vec3 vertexColor;                      

If it complains that GL_ARB_explicit_attrib_location is not supported, then you cannot use the layout(location=#) syntax. You’ll have to set the attribute location via glBindAttribLocation() in the main app, or use whatever attrib locations the compiler gives to you (query via glGetAttribLocation()).
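
For the glBindAttribLocation() route, note that the binding only takes effect if it happens before glLinkProgram(); roughly (using the variable names from your installShaders()):

// Sketch: assign the attribute indices yourself, before linking.
glAttachShader(programID, vertexShaderID);
glAttachShader(programID, fragmentShaderID);

glBindAttribLocation(programID, 0, "position");     // matches glVertexAttribPointer(0, ...)
glBindAttribLocation(programID, 1, "vertexColor");  // matches glVertexAttribPointer(1, ...)

glLinkProgram(programID);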

I finally got this to work. Using #version 400, I took out the layout(location=0) parts completely because my hardware does not support that, so I just declare my variables as

in vec2 position;
in vec3 vertexColor;

And if I want to get the location of these variables from my shader somewhere in my cpp file, I use the glGetAttribLocation() function.
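
In other words, something along these lines after the program has been linked (a sketch; it assumes programID from installShaders() is accessible here):

// Sketch: let the compiler pick the locations, then query them by name
// and use them in place of the hard-coded 0 and 1 in sendDataToOpenGL().
GLint positionLocation = glGetAttribLocation(programID, "position");
GLint colorLocation    = glGetAttribLocation(programID, "vertexColor"); // -1 means not found / optimized away

glEnableVertexAttribArray(positionLocation);
glVertexAttribPointer(positionLocation, 2, GL_FLOAT, GL_FALSE, sizeof(float) * 5, 0);
glEnableVertexAttribArray(colorLocation);
glVertexAttribPointer(colorLocation, 3, GL_FLOAT, GL_FALSE, sizeof(float) * 5, (char*)(sizeof(float) * 2));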
