Works on my machine...

I’m trying to start a small project in OpenGL coming from D3D.

My machine runs Windows Vista with an NVIDIA 9xxx series card (supports OpenGL 3.3 natively) and the latest WHQL drivers. The other machine I tried to test my application on has an ATI 5xxx series card (OpenGL 4.2 natively).

In my app, I just request a pixel format and a forward-compatible context for the latest version available: on my machine that’s OpenGL 3.3, and on his it’s 4.2.
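
The context request goes roughly along these lines (a sketch, not my exact code; it assumes hDC already has a pixel format set and that wglCreateContextAttribsARB was fetched through a temporary legacy context — 3.3 shown, the major/minor numbers are whatever the machine supports):

    // Sketch: request a forward-compatible context via WGL_ARB_create_context.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        WGL_CONTEXT_FLAGS_ARB,         WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0
    };
    HGLRC hRC = wglCreateContextAttribsARB(hDC, 0, attribs);
    wglMakeCurrent(hDC, hRC);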

I don’t do anything fancy: it’s simply a vertex array object with two vertex buffer objects, one for vertices and one for colors. It works perfectly on my machine, but on his the small square shape won’t appear. I know there are differences between ATI and NVIDIA, but something so simple? :frowning:

    
    // Two triangles forming the square at z = -3
    vertices[0] = -0.5; vertices[1] = -0.5; vertices[2] = -3.0; 
    vertices[3] = -0.5; vertices[4] = 0.5; vertices[5] = -3.0; 
    vertices[6] = 0.5; vertices[7] = 0.5; vertices[8] = -3.0; 

    vertices[9] = 0.5; vertices[10] = -0.5; vertices[11] = -3.0; 
    vertices[12] = -0.5; vertices[13] = -0.5; vertices[14] = -3.0; 
    vertices[15] = 0.5; vertices[16] = 0.5; vertices[17] = -3.0;

    // Two more triangles forming a side face at x = -0.5, spanning z = -3 to -4
    vertices[18] = -0.5; vertices[19] = -0.5; vertices[20] = -4.0;
    vertices[21] = -0.5; vertices[22] = 0.5; vertices[23] = -4.0;
    vertices[24] = -0.5; vertices[25] = 0.5; vertices[26] = -3.0;

    vertices[27] = -0.5; vertices[28] = -0.5; vertices[29] = -3.0;
    vertices[30] = -0.5; vertices[31] = -0.5; vertices[32] = -4.0;
    vertices[33] = -0.5; vertices[34] = 0.5; vertices[35] = -3.0;

    // Per-vertex RGBA colors (black and blue)
    colors[0] = 0.0; colors[1] = 0.0; colors[2] = 0.0; colors[3] = 1.0;
    colors[4] = 0.0; colors[5] = 0.0; colors[6] = 1.0; colors[7] = 1.0;
    colors[8] = 0.0; colors[9] = 0.0; colors[10] = 1.0; colors[11] = 1.0;

    colors[12] = 0.0; colors[13] = 0.0; colors[14] = 0.0; colors[15] = 1.0;
    colors[16] = 0.0; colors[17] = 0.0; colors[18] = 0.0; colors[19] = 1.0;
    colors[20] = 0.0; colors[21] = 0.0; colors[22] = 1.0; colors[23] = 1.0;

    X77Shaders(); // This just initializes the shaders, which compile and link with no errors on both machines

    glBindAttribLocation(shaderhandle, 0, "in_Position");
    glBindAttribLocation(shaderhandle, 1, "in_Color");
    glGenBuffers(2, vbo);
    glGenVertexArrays(1, &vao); 
    glBindVertexArray(vao); 

     
    glBindBuffer(GL_ARRAY_BUFFER, vbo[0]); 
    glBufferData(GL_ARRAY_BUFFER, 36 * sizeof(GLfloat), vertices, GL_STATIC_DRAW);
    glEnableVertexAttribArray(0);   // enable attribute 0 (in_Position)
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

    glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
    glBufferData(GL_ARRAY_BUFFER, 24 * sizeof(GLfloat), colors, GL_STATIC_DRAW);
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(1, 4, GL_FLOAT, GL_TRUE, 0, 0);


    glBindVertexArray(0);

the pre-render loop stuff:

    glEnable(GL_DEPTH_TEST);
    glCullFace(GL_BACK);
    glEnable(GL_CULL_FACE);

    /* enable OpenGL for the window */
    EnableOpenGL(hwnd, &hDC, &hRC);


    glViewport(0, 0, 640, 480);
    glClearColor(0.0f, 0.0f, 0.2f, 0.0f);

    glBindVertexArray(vao);  //I never unbind this again until the program quits, so I don't put this in the loop

The render loop is just glClear(depth | color) and glDrawArrays(GL_TRIANGLES, 0, 12), as I’m drawing 4 triangles with 3 vertices each.
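
In code, the loop body amounts to this (sketch; SwapBuffers/hDC are the usual Win32 pieces from the context setup):

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glDrawArrays(GL_TRIANGLES, 0, 12);   // 4 triangles * 3 vertices
    SwapBuffers(hDC);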

I check for errors during init, in the render loop, and while compiling and linking the shaders: there are no errors on either machine.
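
By “check for errors” I mean the usual glGetError and info-log queries, roughly this (sketch; ‘shader’ and ‘shaderhandle’ stand for the compiled shader object and the linked program, printf just for brevity):

    // Drain the GL error queue, then query compile/link status.
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        printf("GL error: 0x%04X\n", err);

    GLint ok = GL_FALSE;
    char log[1024];
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        glGetShaderInfoLog(shader, sizeof(log), NULL, log);
        printf("compile failed: %s\n", log);
    }
    glGetProgramiv(shaderhandle, GL_LINK_STATUS, &ok);
    if (!ok) {
        glGetProgramInfoLog(shaderhandle, sizeof(log), NULL, log);
        printf("link failed: %s\n", log);
    }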

What gives?

The attribute locations you provide with glBindAttribLocation only come into effect for the shader program when glLinkProgram is called, and are then fixed until the next time it is linked.
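
In other words, the order has to be roughly this (sketch; ‘vertexshader’ and ‘fragmentshader’ stand for your compiled shader objects):

    glAttachShader(shaderhandle, vertexshader);
    glAttachShader(shaderhandle, fragmentshader);

    // Bind the attribute locations BEFORE linking...
    glBindAttribLocation(shaderhandle, 0, "in_Position");
    glBindAttribLocation(shaderhandle, 1, "in_Color");

    // ...so the link picks them up.
    glLinkProgram(shaderhandle);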

Thank you very much Dan; all I needed to do was move the attribute binds to before the program is linked. I appreciate the help.

It does interest me that it worked with my NVIDIA card/drivers and failed with his ATI card/drivers… any idea how that could have occurred?

It probably just happened that in_Position was assigned location 0 and in_Color was assigned location 1 on NVIDIA, so it worked, whereas ATI might have assigned different values.
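
You can verify what each driver actually chose by querying after the link (sketch):

    GLint posLoc = glGetAttribLocation(shaderhandle, "in_Position");
    GLint colLoc = glGetAttribLocation(shaderhandle, "in_Color");
    printf("in_Position = %d, in_Color = %d\n", posLoc, colLoc);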

Makes sense, and again, thanks.

[QUOTE=djohnson3278;1239636]
It does interest me that it worked with my NVIDIA card/drivers and failed with his ATI card/drivers… any idea how that could have occurred?[/QUOTE]

The linker assigns attribute locations unless you provide them explicitly, either via the calls above or with an explicit attribute location in the shader itself (available starting with OpenGL 3.3). The linker-provided locations will not collide (each attribute gets its own location), but beyond that the order is not defined. NVIDIA and AMD are both correct here: it’s not defined, so they each do it ‘somehow’. NVIDIA assigns them in order of appearance in the source, AMD alphabetically (note that this might change in the future, which is fine and legal). Apple? I don’t know what they do by default.
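
For completeness, the explicit in-shader form mentioned above looks like this (sketch of just the vertex shader input declarations, GLSL 330, shown here as a C string):

    // Explicit attribute locations in the shader itself (OpenGL 3.3 / GLSL 330,
    // or the GL_ARB_explicit_attrib_location extension on older versions).
    const char* vs_inputs =
        "#version 330\n"
        "layout(location = 0) in vec3 in_Position;\n"
        "layout(location = 1) in vec4 in_Color;\n";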

Bottom line: if you don’t provide the locations and your app runs anyway, you just got lucky (or rather unlucky, since it hides an ugly bug).