Invalid qualifiers 'in' in global variable

Hi, I'm trying to compile a simple vertex shader, but I keep getting this error:
“ERROR: 0:2: Invalid qualifiers ‘in’ in global variable context”.

Here is the code it's referring to:


const char* vertexSource =
"#version 120\n"
"in vec2 position;"
"void main() {"
"    gl_Position = vec4( position, 0.0, 1.0 );"
"}";

What am I doing wrong?

You have set your version to 1.20, which is from before "in" was valid. Change your version to 320.

That didn’t solve it. :frowning: Now I get this error instead:
“version ‘320’ is not supported”

The tutorial I’m reading uses version 1.50, but that does not seem to work either.
I should say that I'm working on a Mac (OS X 10.8.3).

Shouldn’t 1.50 work on Mac?

Try changing “in” to “varying” and leave the version the same.

Well, it doesn’t complain anymore :slight_smile:
But still, nothing shows up…

I have never used shaders before, so I have no idea where something could be wrong.
Can you have a look at my code? :slight_smile:


#include <GLUT/GLUT.h>
#include <iostream>
#include <vector>
#include <math.h>

#include "debug.h"
#include "init.h"
#include "userinteraction.h"
#include "camera.h"

using namespace std;

// Global variables
bool EXIT_LOOP = false;
float pi = atan(1)*4;

// Global Objects
Debug debug;
Camera camera;

const char* vertexSource =
"#version 120\n"
"varying vec2 position;"
"void main() {"
"    gl_Position = vec4( position, 0.0, 1.0 );"
"}";

const char* fragmentSource =
"#version 120\n"
"varying vec4 outColor;"
"void main() {"
"    outColor = vec4( 1.0, 1.0, 1.0, 1.0 );"
"}";



void display(void)
{
    keyOperations();
    keySpecialOperations();
    
    
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(5.0,5.0,5.0);
    glDrawArrays( GL_TRIANGLES, 0, 3 );

	glutSwapBuffers();
    
    if (EXIT_LOOP) exit(0);
}


int main(int argc, char** argv) {
    init(argc, argv);

    float vArray[] = {
		0.0f, 0.5f,
		0.5f, -0.5f,
		-0.5f, -0.5f
	};
    
    GLuint myvbo;
    glGenBuffers(1, &myvbo);
    glBindBuffer(GL_ARRAY_BUFFER, myvbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);
    
    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexSource, NULL);
    glCompileShader(vertexShader);
    
    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
    glCompileShader(fragmentShader);
    
    GLuint shaderProgram = glCreateProgram();
    glAttachShader(shaderProgram, vertexShader);
    glAttachShader(shaderProgram, fragmentShader);
    glBindFragDataLocationEXT(shaderProgram, 0, "outColor");
    glLinkProgram(shaderProgram);
    glUseProgram(shaderProgram);
    
    GLint posAttrib = glGetAttribLocation(shaderProgram, "position");
    glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(posAttrib);
    
    GLuint vao;
    glGenVertexArraysAPPLE(1, &vao);
    glBindVertexArrayAPPLE(vao);

    glutDisplayFunc(display);
    glutIdleFunc(display);
    glutReshapeFunc(handleResize);
    
    glutKeyboardFunc(keyPressed);
    glutKeyboardUpFunc(keyUp);
    
    glutMainLoop();
    return 0;
}

Try to find out what version your driver actually supports and change your #version to match. Try glGetString(GL_VERSION).
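
For example, somewhere after init() while the context is current (a quick sketch; it assumes you already include <iostream>):

// Print what GL and GLSL version the current context actually gives you
std::cout << "GL_VERSION:   " << glGetString(GL_VERSION) << std::endl;
std::cout << "GLSL_VERSION: " << glGetString(GL_SHADING_LANGUAGE_VERSION) << std::endl;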

If you want or need to stay with version 1.20, download this and don't use anything that is not in it:
http://www.opengl.org/registry/doc/GLSLangSpec.Full.1.20.8.pdf

With version 1.20 you can only output colour through the built-in variable gl_FragColor:


#version 120
void main()
{
    gl_FragColor = vec4( 1.0, 1.0, 1.0, 1.0 );
}

glGetString(GL_VERSION) returns “2.1”
and
glGetString(GL_SHADING_LANGUAGE_VERSION) returns “1.20”

This is what Apple’s app, “OpenGL Driver Monitor”, tells me about my graphics card:
NVIDIA GeForce 320M OpenGL Engine
Vendor Name: NVIDIA Corporation
Core Profile Version: 3.2 NVIDIA-8.10.44 304.10.65f03
Core Profile GLSL Version: 1.50
Compatibility Version: 2.1 NVIDIA-8.10.44 304.10.65f03
Compatibility GLSL Version: 1.20

Apparently I'm in compatibility mode…

I followed Michael Dorner’s example from macos - GLUT on OS X with OpenGL 3.2 Core Profile - Stack Overflow.
I just added GLUT_3_2_CORE_PROFILE to glutInitDisplayMode();
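Roughly like this; the other flags are whatever init() was already passing, so GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH here is just an example:

glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
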
Now it says I'm using 3.2 and 1.50, and it no longer gives an error when using 'in' and 'out'. :smiley:

But there is still no triangle on the screen :frowning:

You should move the buffer object binding call after the VAO binding call, i.e.:



GLuint myvbo;
glGenBuffers(1, &myvbo);
// commented these out to move them after the VAO binding:
// glBindBuffer(GL_ARRAY_BUFFER, myvbo);
// glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
// ... rest of your stuff ...

GLuint vao;
glGenVertexArraysAPPLE(1, &vao);
glBindVertexArrayAPPLE(vao);
glBindBuffer(GL_ARRAY_BUFFER, myvbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

See if this helps.

No, that didn’t make any difference :frowning:

I threw GLUT in the trash and switched to GLFW instead, and everything works now! :slight_smile:
Followed the tutorial at: www.open.gl
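
In case it helps anyone else: the important part is asking GLFW for a 3.2 core profile context up front. With the GLFW 3 API that looks roughly like this (window size and title are placeholders; the www.open.gl tutorial covers the rest):

#include <GLFW/glfw3.h>

// Request an OpenGL 3.2 core profile context; forward-compatibility is required on OS X.
glfwInit();
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
GLFWwindow* window = glfwCreateWindow(800, 600, "Triangle", NULL, NULL);
glfwMakeContextCurrent(window);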