First application, red book

Hi,
I tried to compile my first application (red book, chapter 1) but there are too many errors.
This is my github: GitHub - DavideZanin/OpenGL_001_triangolo
This is the structure:https://dl.dropboxusercontent.com/u/57344164/Schermata%202014-04-01%20alle%2000.39.24.png
This is the code:

#ifdef WIN32
#include <windows.h>
#endif

#include <iostream>
using namespace std;
#include <fstream>
#include <GLUT/glut.h>
#include <OpenGL/glu.h>
#include <OpenGL/gl.h>

enum VAO_IDs { Triangles, NumVAOs };
enum Buffer_IDs { ArrayBuffer, NumBuffers };
enum Attrib_IDs { vPosition = 0 };

GLuint VAOs[NumVAOs];
GLuint Buffers[NumBuffers];
const GLuint NumVertices = 6;

void init(void)
{
glGenVertexArrays(NumVAOs, VAOs);
glBindVertexArray(VAOs[Triangles]);
GLfloat vertices[NumVertices][2] = {
{ -0.90, -0.90 }, // Triangle 1
{ 0.85, -0.90 },
{ -0.90, 0.85 },
{ 0.90, -0.85 }, // Triangle 2
{ 0.90, 0.90 },
{ -0.85, 0.90 }
};

glGenBuffers(NumBuffers, Buffers);
glBindBuffer(GL_ARRAY_BUFFER, Buffers[ArrayBuffer]);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
ShaderInfo  shaders[] = {
    { GL_VERTEX_SHADER, "triangles.vert" },
    { GL_FRAGMENT_SHADER, "triangles.frag" },
    { GL_NONE, NULL }
};
GLuint program = LoadShaders(shaders);
glUseProgram(program);
glVertexAttribPointer(vPosition, 2, GL_FLOAT, GL_FALSE, 0, BUFFER_OFFSET(0));
glEnableVertexAttribArray(vPosition);
}

void display(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glBindVertexArray(VAOs[Triangles]);
glDrawArrays(GL_TRIANGLES, 0, NumVertices);
glFlush();

}

int
main(int argc, char** argv)
{
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_RGBA);
glutInitWindowSize(512, 512);
glutInitContextVersion(4, 3);
glutInitContextProfile(GLUT_CORE_PROFILE);
glutCreateWindow(argv[0]);

if (glewInit()) {
    cerr << "Unable to initialize GLEW ... exiting" << endl; exit(EXIT_FAILURE);
}


init();
glutDisplayFunc(display);
glutMainLoop();
}

Errors:https://dl.dropboxusercontent.com/u/57344164/Schermata%202014-04-01%20alle%2001.43.11.png
What is the problem?

You are missing includes. Try to find out how to set up a basic OpenGL program on an Apple system.

From the error messages, it looks like you’re building on Mac OS X 10.9? Apple seems pretty serious about deprecating GLUT. I was successful using it with GL 3.2 the other day, but I’m not sure if anything newer is still possible. What I did was:
[ul]
[li]Include gl3.h before glut.h.[/li]
[li]Afterwards, and still before including glut.h, define __gl_h_. This prevents glut.h from including the old GL header and then complaining about conflicts.[/li]
[li]Add GLUT_3_2_CORE_PROFILE to the flags passed to glutInitDisplayMode().[/li]
[li]Add a compiler flag to suppress the warnings about the deprecated GLUT entry points. I forget the exact name, something like -Wno-deprecated-…[/li]
[/ul]
I don’t see the glutInitContext*() calls defined in glut.h, so you’ll have to disable them for Mac OS.
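If you want to keep one source file for both platforms, one option is to guard those calls so they’re only compiled where glut.h actually declares them. A rough sketch (untested fragment, not a full program):

```cpp
glutInit(&argc, argv);
#ifdef __APPLE__
    // Mac GLUT has no glutInitContext*(); request a core profile
    // via the display mode flag instead.
    glutInitDisplayMode(GLUT_RGBA | GLUT_3_2_CORE_PROFILE);
#else
    glutInitDisplayMode(GLUT_RGBA);
    glutInitContextVersion(4, 3);
    glutInitContextProfile(GLUT_CORE_PROFILE);
#endif
```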

ShaderInfo and BUFFER_OFFSET need to be defined somewhere in your code. I don’t think they are defined anywhere in a standard header.

Another possibility might be to use an alternate GLUT implementation. I believe there are open source implementations around.

Yes, I use OS X Mavericks.
If I include gl3.h I get this warning: “gl.h and gl3.h are both included. Compiler will not invoke errors if using removed OpenGL functionality.” (I also tried removing gl.h.)
What should I write to prevent glut.h from including the old GL header?
And why does the book use GL and not gl3? (It’s the bible of OpenGL!)
I used the latest Red Book, OpenGL 4.3, and this is the first example. It’s so frustrating!! (If the example is for OpenGL 4.3, why is something deprecated???)
In addition, I’m a beginner, so I can’t understand all the errors!

That’s where the second bullet item in my previous post comes in. I had success with the following sequence:


#include <OpenGL/gl3.h>
#define __gl_h_
#include <GLUT/glut.h>

The compiler option I used to make it shut up about the deprecated functions is -Wno-deprecated-declarations.

It’s not OpenGL itself that you’re having problems with, it’s the toolkit (GLUT) that they use for the examples. Certainly frustrating that the toolkit they use is not really supported anymore on your platform of choice. As I suggested in my previous post, there are open source implementations of GLUT, which might be worth trying. I know there’s something called FreeGLUT, but have no personal experience with it. This question goes in the same direction, and has some answers:

http://stackoverflow.com/questions/19648087/installing-freeglut-on-os-x-mavericks

[QUOTE=reto.koradi;1259046]That’s where the second bullet item in my previous post comes in. I had success with the following sequence: …[/QUOTE]

These are my errors/warnings now:
[ATTACH=CONFIG]615[/ATTACH]
What should I do?
My question now is: is it possible to use Xcode and OpenGL 4.3?

Well, of course you can always use OpenGL the way Apple wants you to: Using Cocoa instead of GLUT. The initial learning curve will be higher, though, because you’ll have to pick up another non-trivial framework. An alternative is FreeGLUT. Then there’s also GLFW (http://www.glfw.org), which I haven’t used, but seems to become popular as a GLUT replacement.
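To give you an idea of what GLFW looks like, a minimal window/context setup is roughly the following (an untested sketch based on the GLFW 3 docs; the title and sizes are arbitrary, and there’s no drawing code):

```cpp
#define GLFW_INCLUDE_GLCOREARB  // ask glfw3.h for the core-profile GL header on Mac
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;

    // Request a 3.2 core, forward-compatible context; Mac requires
    // exactly this combination for anything newer than GL 2.1.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

    GLFWwindow* window = glfwCreateWindow(512, 512, "triangles", NULL, NULL);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    // Unlike GLUT's callback-driven glutMainLoop(), GLFW gives you
    // an explicit render loop.
    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```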

Looking at your Xcode structure linked in the original post, you seem to have a pretty wild mix of different things. You have references to the OpenGL frameworks, but you also have your own OpenGL headers. Similarly for GLUT: you have a reference to the standard framework, but also FreeGLUT files inside your project. And your code calls glewInit(), which belongs to GLEW, a library typically used under Windows.

I only use Xcode when I absolutely have to, so I’m not the right person to help you with that.

I grabbed the code from the GitHub link in your original post, and got most of it building on my Mac without much trouble. I don’t have the LoadShaders() code, so I temporarily disabled those calls, which of course means it wouldn’t render anything. But the code below builds without warnings or errors with this command line:


clang -Wno-deprecated-declarations glut.cpp -framework GLUT -framework OpenGL

This was with the following variation of your original code in a source file named glut.cpp. Sorry, I couldn’t resist cleaning up some formatting while I was editing it.


#include <OpenGL/gl3.h>
#define __gl_h_
#include <GLUT/glut.h>

enum VAO_IDs { Triangles, NumVAOs };
enum Buffer_IDs { ArrayBuffer, NumBuffers };
enum Attrib_IDs { vPosition = 0 };

GLuint  VAOs[NumVAOs];
GLuint  Buffers[NumBuffers];
const GLuint NumVertices = 6;

void init()
{
    glGenVertexArrays(NumVAOs, VAOs);
    glBindVertexArray(VAOs[Triangles]);
    GLfloat vertices[NumVertices][2] = {
        { -0.90, -0.90 },  // Triangle 1
        {  0.85, -0.90 },
        { -0.90,  0.85 },
        {  0.90, -0.85 },  // Triangle 2
        {  0.90,  0.90 },
        { -0.85,  0.90 }
    };

    glGenBuffers(NumBuffers, Buffers);
    glBindBuffer(GL_ARRAY_BUFFER, Buffers[ArrayBuffer]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

#if 0
    ShaderInfo shaders[] = {
        { GL_VERTEX_SHADER, "triangles.vert" },
        { GL_FRAGMENT_SHADER, "triangles.frag" },
        { GL_NONE, NULL }
    };

    GLuint program = LoadShaders(shaders);
#else
    GLuint program = 0;
#endif

    glUseProgram(program);
    glVertexAttribPointer(vPosition, 2, GL_FLOAT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(vPosition);
}

void display()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBindVertexArray(VAOs[Triangles]);
    glDrawArrays(GL_TRIANGLES, 0, NumVertices);
    glutSwapBuffers();
}

int main(int argc, char* argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_3_2_CORE_PROFILE);
    glutInitWindowSize(512, 512);
    glutCreateWindow(argv[0]);

    init();
    glutDisplayFunc(display);
    glutMainLoop();
}

[QUOTE=reto.koradi;1259052]Well, of course you can always use OpenGL the way Apple wants you to: Using Cocoa instead of GLUT. …[/QUOTE]

I tried FreeGLUT and there are also problems in Xcode.
No problems with FreeGLUT on Windows + Visual Studio!
Sorry for taking your time! Thanks a lot!