Part of the Khronos Group
OpenGL.org


Thread: Invalid qualifiers 'in' in global variable

  1. #1
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6

    Question Invalid qualifiers 'in' in global variable

    Hi, I'm trying to compile a simple vertex shader, but I keep getting this error:
    "ERROR: 0:2: Invalid qualifiers 'in' in global variable context".

    Here is the code it's referring to:
    Code :
    const char* vertexSource =
    "#version 120\n"
    "in vec2 position;"
    "void main() {"
    "	gl_Position = vec4( position, 0.0, 1.0 );"
    "}";

    What am I doing wrong?

  2. #2
    Senior Member OpenGL Pro
    Join Date
    Jan 2012
    Location
    Australia
    Posts
    1,117
    You have set your version to 1.20, which is from before "in" was a valid qualifier. Change your version to 320

  3. #3
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6
    That didn't solve it. Now I get this error instead:
    "version '320' is not supported"

    The tutorial I'm reading uses version 1.50, but that does not seem to work either.
    I should say that I'm working on a Mac (Mac OS X 10.8.3).

    Shouldn't 1.50 work on Mac?

  4. #4
    Senior Member OpenGL Pro
    Join Date
    Jan 2012
    Location
    Australia
    Posts
    1,117
    Try changing "in" to "varying" and leave the version the same.
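
    For what it's worth, in GLSL 1.20 "attribute" is the qualifier for per-vertex inputs and "varying" is for values interpolated from the vertex to the fragment stage, so a strictly 1.20-style version of your shader would be something like:
    Code :
    #version 120
    // 'attribute' marks a per-vertex input in GLSL 1.20
    attribute vec2 position;
    void main() {
        gl_Position = vec4( position, 0.0, 1.0 );
    }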

  5. #5
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6
    Well, it doesn't complain anymore.
    But still, nothing shows up...

    I have never tried using shaders before, so I have no idea where something could be wrong.
    Can you have a look at my code?

    Code :
    #include <GLUT/GLUT.h>
    #include <iostream>
    #include <vector>
    #include <math.h>
     
    #include "debug.h"
    #include "init.h"
    #include "userinteraction.h"
    #include "camera.h"
     
    using namespace std;
     
    // Global variables
    bool EXIT_LOOP = false;
    float pi = atan(1)*4;
     
    // Global Objects
    Debug debug;
    Camera camera;
     
     
     
     
     
     
    const char* vertexSource =
    "#version 120\n"
    "varying vec2 position;"
    "void main() {"
    "	gl_Position = vec4( position, 0.0, 1.0 );"
    "}";
     
    const char* fragmentSource =
    "#version 120\n"
    "varying vec4 outColor;"
    "void main() {"
    "	outColor = vec4( 1.0, 1.0, 1.0, 1.0 );"
    "}";
     
     
     
    void display(void)
    {
        keyOperations();
        keySpecialOperations();
     
     
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(5.0,5.0,5.0);
        glDrawArrays( GL_TRIANGLES, 0, 3 );
     
    	glutSwapBuffers();
     
        if (EXIT_LOOP) exit(0);
    }
     
     
    int main(int argc, char** argv) {
    	init(argc, argv);
     
     
     
     
     
        float vArray[] = {
    		0.0f, 0.5f,
    		0.5f, -0.5f,
    		-0.5f, -0.5f
    	};
     
        GLuint myvbo;
        glGenBuffers(1, &myvbo);
        glBindBuffer(GL_ARRAY_BUFFER, myvbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);
     
        GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vertexShader, 1, &vertexSource, NULL);
        glCompileShader(vertexShader);
     
        GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
        glCompileShader(fragmentShader);
     
        GLuint shaderProgram = glCreateProgram();
        glAttachShader(shaderProgram, vertexShader);
        glAttachShader(shaderProgram, fragmentShader);
        glBindFragDataLocationEXT(shaderProgram, 0, "outColor");
        glLinkProgram(shaderProgram);
        glUseProgram(shaderProgram);
     
        GLint posAttrib = glGetAttribLocation(shaderProgram, "position");
        glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, 0);
        glEnableVertexAttribArray(posAttrib);
     
        GLuint vao;
        glGenVertexArraysAPPLE(1, &vao);
        glBindVertexArrayAPPLE(vao);
     
     
     
     
     
     
        glutDisplayFunc(display);
        glutIdleFunc(display);
        glutReshapeFunc(handleResize);
     
        glutKeyboardFunc(keyPressed);
        glutKeyboardUpFunc(keyUp);
     
        glutMainLoop();
        return 0;
    }

  6. #6
    Senior Member OpenGL Pro
    Join Date
    Jan 2012
    Location
    Australia
    Posts
    1,117
    Try to find out what driver version you have and change the version to match. Try glGetString(GL_VERSION).

    If you want or need to stay with version 1.20, download this and don't use anything that is not in it:
    http://www.opengl.org/registry/doc/G...ull.1.20.8.pdf

    With version 1.20 you can only output colour through the built-in variable gl_FragColor:

    Code :
    #version 120
    void main() 
    {   
      gl_FragColor= vec4( 1.0, 1.0, 1.0, 1.0 );
    }
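
    Also, your program never checks whether the shaders actually compiled. After each glCompileShader call you can query the compile status and print the info log; a sketch (assuming a current GL context and your existing headers) might look like:
    Code :
    GLint status = GL_FALSE;
    glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &status);
    if (status != GL_TRUE) {
        // Fetch and print the driver's error message
        char log[512];
        glGetShaderInfoLog(vertexShader, sizeof(log), NULL, log);
        std::cerr << "Vertex shader failed to compile: " << log << std::endl;
    }
    Do the same for the fragment shader, and check GL_LINK_STATUS with glGetProgramiv after linking.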

  7. #7
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6
    glGetString(GL_VERSION) returns "2.1"
    and
    glGetString(GL_SHADING_LANGUAGE_VERSION) returns "1.20"

    This is what Apple's app, "OpenGL Driver Monitor", tells me about my graphics card:
    NVIDIA GeForce 320M OpenGL Engine
    Vendor Name NVIDIA Corporation
    Core Profile Version 3.2 NVIDIA-8.10.44 304.10.65f03
    Core Profile GLSL Version 1.50
    Compatibility Version 2.1 NVIDIA-8.10.44 304.10.65f03
    Compatibility GLSL Version 1.20


    Apparently I'm in compatibility mode...

    I followed Michael Dorner's example from http://stackoverflow.com/questions/1...2-core-profile.
    I just added GLUT_3_2_CORE_PROFILE to glutInitDisplayMode().
    Now it says I'm using 3.2 and 1.50, and it doesn't give an error when using 'in' and 'out'.
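
    The call now looks something like this (the exact set of other flags may differ):
    Code :
    // GLUT_3_2_CORE_PROFILE is specific to Apple's GLUT fork
    glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);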

    But there is still no triangle on the screen
    Last edited by troels_y; 05-19-2013 at 06:27 AM.

  8. #8
    Advanced Member Frequent Contributor
    Join Date
    Mar 2009
    Location
    Singapore
    Posts
    800
    You should move the buffer object binding call after the VAO binding call, i.e.
    Code :
     
    GLuint myvbo; glGenBuffers(1, &myvbo);
    //commented this to go after the VAO binding 
    //    glBindBuffer(GL_ARRAY_BUFFER, myvbo); 
    //    glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);      
     GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);    
    ///rest of your stuff
     
    GLuint vao; 
    glGenVertexArraysAPPLE(1, &vao); 
    glBindVertexArrayAPPLE(vao);
    glBindBuffer(GL_ARRAY_BUFFER, myvbo); 
    glBufferData(GL_ARRAY_BUFFER, sizeof(vArray), vArray, GL_STATIC_DRAW);

    See if this helps.
    Regards,
    Mobeen

  9. #9
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6
    No, that didn't make any difference.

  10. #10
    Junior Member Newbie
    Join Date
    May 2013
    Posts
    6
    I threw GLUT in the trash and switched to GLFW instead; everything works now!
    I followed the tutorial at: www.open.gl
