
View Full Version : Invalid qualifiers 'in' in global variable



troels_y
05-18-2013, 04:40 PM
Hi, I'm trying to compile a simple vertex shader, but I keep getting this error:
"ERROR: 0:2: Invalid qualifiers 'in' in global variable context".

Here is the code it's referring to:


const char* vertexSource =
    "#version 120\n"
    "in vec2 position;"
    "void main() {"
    "    gl_Position = vec4( position, 0.0, 1.0 );"
    "}";


What am I doing wrong?

tonyo_au
05-18-2013, 06:43 PM
You have set your version to 1.2, which is from before "in" was a valid qualifier. Change your version to 320

troels_y
05-19-2013, 01:45 AM
That didn't solve it. :( Now I get this error instead:
"version '320' is not supported"

The tutorial I'm reading uses version 1.50, but that does not seem to work either.
I should say that I'm working on a Mac (Mac OS X 10.8.3).

Shouldn't 1.50 work on a Mac?

tonyo_au
05-19-2013, 02:14 AM
Try changing "in" to "varying" and leave the version the same.
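[For reference: in GLSL 1.20 a per-vertex input is declared with `attribute`, while `varying` is for values interpolated from the vertex to the fragment stage. A 1.20 equivalent of the shader above might look like this (a sketch, not tested on this driver):]

```glsl
#version 120
attribute vec2 position;  // per-vertex input in GLSL 1.20
void main() {
    gl_Position = vec4(position, 0.0, 1.0);
}
```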

troels_y
05-19-2013, 04:19 AM
Well, it doesn't complain anymore :)
But still, nothing shows up...

I have never used shaders before, so I have no idea where something could be wrong.
Can you have a look at my code? :)



#include <GLUT/GLUT.h>
#include <iostream>
#include <vector>
#include <math.h>

#include "debug.h"
#include "init.h"
#include "userinteraction.h"
#include "camera.h"

using namespace std;

// Global variables
bool EXIT_LOOP = false;
float pi = atan(1)*4;

// Global Objects
Debug debug;
Camera camera;






const char* vertexSource =
    "#version 120\n"
    "varying vec2 position;"
    "void main() {"
    "    gl_Position = vec4( position, 0.0, 1.0 );"
    "}";

const char* fragmentSource =
    "#version 120\n"
    "varying vec4 outColor;"
    "void main() {"
    "    outColor = vec4( 1.0, 1.0, 1.0, 1.0 );"
    "}";



void display(void)
{
    keyOperations();
    keySpecialOperations();

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(5.0, 5.0, 5.0);
    glDrawArrays(GL_TRIANGLES, 0, 3);

    glutSwapBuffers();

    if (EXIT_LOOP) exit(0);
}


int main(int argc, char** argv) {
    init(argc, argv);

    float vArray[] = {
         0.0f,  0.5f,
         0.5f, -0.5f,
        -0.5f, -0.5f
    };

    GLuint myvbo;
    glGenBuffers(1, &myvbo);
    glBindBuffer(GL_ARRAY_BUFFER, myvbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexSource, NULL);
    glCompileShader(vertexShader);

    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
    glCompileShader(fragmentShader);

    GLuint shaderProgram = glCreateProgram();
    glAttachShader(shaderProgram, vertexShader);
    glAttachShader(shaderProgram, fragmentShader);
    glBindFragDataLocationEXT(shaderProgram, 0, "outColor");
    glLinkProgram(shaderProgram);
    glUseProgram(shaderProgram);

    GLint posAttrib = glGetAttribLocation(shaderProgram, "position");
    glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(posAttrib);

    GLuint vao;
    glGenVertexArraysAPPLE(1, &vao);
    glBindVertexArrayAPPLE(vao);

    glutDisplayFunc(display);
    glutIdleFunc(display);
    glutReshapeFunc(handleResize);

    glutKeyboardFunc(keyPressed);
    glutKeyboardUpFunc(keyUp);

    glutMainLoop();
    return 0;
}

tonyo_au
05-19-2013, 06:43 AM
Try to find out what driver version you have and change the version to match. Try glGetString(GL_VERSION).

If you want or need to stay with version 1.20, download this and don't use anything that is not in it:
http://www.opengl.org/registry/doc/GLSLangSpec.Full.1.20.8.pdf

With version 1.2 you can only output colour through the built-in variable gl_FragColor:



#version 120
void main()
{
    gl_FragColor = vec4( 1.0, 1.0, 1.0, 1.0 );
}

troels_y
05-19-2013, 07:19 AM
glGetString(GL_VERSION) returns "2.1"
and
glGetString(GL_SHADING_LANGUAGE_VERSION) returns "1.20"

This is what Apple's app, "OpenGL Driver Monitor", tells me about my graphics card:
NVIDIA GeForce 320M OpenGL Engine
Vendor Name NVIDIA Corporation
Core Profile Version 3.2 NVIDIA-8.10.44 304.10.65f03
Core Profile GLSL Version 1.50
Compatibility Version 2.1 NVIDIA-8.10.44 304.10.65f03
Compatibility GLSL Version 1.20

Apparently I'm in compatibility mode...

I followed Michael Dorner's example from http://stackoverflow.com/questions/11259328/glut-on-os-x-with-opengl-3-2-core-profile.
I just added GLUT_3_2_CORE_PROFILE to glutInitDisplayMode().
Now it says I'm using 3.2 and 1.50, and it no longer gives an error when using 'in' and 'out'. :D

But there is still no triangle on the screen :(
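[When a shader seems to compile but nothing draws, querying the compile and link logs usually narrows it down. A minimal sketch, assuming a current GL context; `checkShader` is a hypothetical helper, not part of the code above:]

```cpp
// Hypothetical helper: print the info log if compilation failed.
void checkShader(GLuint shader)
{
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE) {
        char log[512];
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        std::cerr << "shader error: " << log << std::endl;
    }
}
// After glLinkProgram, GL_LINK_STATUS and glGetProgramInfoLog
// work the same way for the program object.
```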

mobeen
05-19-2013, 09:11 AM
You should move the buffer object binding calls to after the VAO binding call, i.e.:



GLuint myvbo;
glGenBuffers(1, &myvbo);
// commented these out to go after the VAO binding
// glBindBuffer(GL_ARRAY_BUFFER, myvbo);
// glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
// ...rest of your stuff

GLuint vao;
glGenVertexArraysAPPLE(1, &vao);
glBindVertexArrayAPPLE(vao);
glBindBuffer(GL_ARRAY_BUFFER, myvbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);


See if this helps.
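[The same reasoning extends to the attribute pointer: in a core profile, glEnableVertexAttribArray and glVertexAttribPointer record state into the currently bound VAO, so they should also come after the VAO bind. A sketch of the full ordering, assuming `shaderProgram` has already been linked and `vArray` is in scope:]

```cpp
// VAO first: it captures all subsequent vertex-attribute state.
GLuint vao;
glGenVertexArrays(1, &vao);      // core-profile name (no APPLE suffix in 3.2 core)
glBindVertexArray(vao);

// Then the buffer and its data.
GLuint myvbo;
glGenBuffers(1, &myvbo);
glBindBuffer(GL_ARRAY_BUFFER, myvbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

// Finally the attribute setup, recorded into the bound VAO.
GLint posAttrib = glGetAttribLocation(shaderProgram, "position");
glEnableVertexAttribArray(posAttrib);
glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, 0);
```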

troels_y
05-22-2013, 05:54 AM
No, that didn't make any difference :(

troels_y
05-22-2013, 03:11 PM
I threw GLUT in the trash and switched to GLFW instead; everything works now! :)
I followed the tutorial at www.open.gl.