
Thread: Black Screen -- Should Be a Triangle

  1. #1
    Junior Member Newbie

    Hello,

    I am using OpenGL 3.2, OS X 10.8, and Xcode 4.6, with GLFW handling my window management. I am trying to draw a simple triangle by following this tutorial:
    open.gl

    However, I don't get a triangle, just a black screen. I get no shader compilation errors or linker errors.
    Here is my code:

    Code :
     
     
    #include <cstdio>
    #include <cstdlib>
    #include <iostream>
    #include <GL/glew.h>
    #include <GL/glfw.h>
     
    // Triangle vertex positions in normalized device coordinates
    float vertices[] = {
         0.0f,  0.5f, // Vertex 1 (X, Y)
         0.5f, -0.5f, // Vertex 2 (X, Y)
        -0.5f, -0.5f  // Vertex 3 (X, Y)
    };
     
    const char* vertexSource =
    "#version 150\n"
    "in vec2 position;"
    "in vec3 color;"
    "out vec3 Color;"
    "void main() {"
    "    Color = color;"
    "    gl_Position = vec4( position, 0.0, 1.0 );"
    "}";
     
    const char* fragmentSource =
    "#version 150\n"
    "in vec3 Color;"
    "out vec4 outColor;"
    "void main() {"
    "    outColor = vec4( Color, 1.0 );"
    "}";
     
    int main(int argc, const char * argv[])
    {
        if (glfwInit() == GL_FALSE) {
            printf("Failed to init glfw\n");
            exit(1);
        } else {
            printf("glfw was initiated\n");
        }
     
        // Request an OpenGL 3.2 Core Profile context
        glfwOpenWindowHint(GLFW_OPENGL_VERSION_MAJOR, 3);
        glfwOpenWindowHint(GLFW_OPENGL_VERSION_MINOR, 2);
        glfwOpenWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        glfwOpenWindowHint(GLFW_WINDOW_NO_RESIZE, GL_TRUE);
        glfwOpenWindow(800, 600, 0, 0, 0, 0, 0, 0, GLFW_WINDOW);
     
        glewExperimental = GL_TRUE;
        if (glewInit() == GLEW_OK) {
            printf("glew is ready\n");
        } else {
            printf("failed to init glew\n");
        }
     
        glfwSetWindowTitle("simple-gl");
        glfwEnable(GLFW_STICKY_KEYS);
     
        // Create and bind a vertex array object
        GLuint vao;
        glGenVertexArraysAPPLE(1, &vao);
        glBindVertexArrayAPPLE(vao);
     
        // Upload the triangle vertices to a vertex buffer object
        GLuint vbo;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
     
        // Compile the vertex shader and print its info log on failure
        GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
        glShaderSource(vertexShader, 1, &vertexSource, NULL);
        glCompileShader(vertexShader);
     
        GLint status;
        glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &status);
        if (!status) {
            char buffer[512];
            glGetShaderInfoLog(vertexShader, 512, NULL, buffer);
            std::cout << "Vertex shader error: " << buffer << std::endl;
        }
     
        // Compile the fragment shader and print its info log on failure
        GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
        glCompileShader(fragmentShader);
     
        GLint otherStatus;
        glGetShaderiv(fragmentShader, GL_COMPILE_STATUS, &otherStatus);
        if (!otherStatus) {
            char buffer[512];
            glGetShaderInfoLog(fragmentShader, 512, NULL, buffer);
            std::cout << "Fragment shader error: " << buffer << std::endl;
        }
     
        // Link both shaders into a program and make it current
        GLuint shaderProgram = glCreateProgram();
        glAttachShader(shaderProgram, vertexShader);
        glAttachShader(shaderProgram, fragmentShader);
        glBindFragDataLocation(shaderProgram, 0, "outColor");
        glLinkProgram(shaderProgram);
        glUseProgram(shaderProgram);
     
        // Describe how "position" is laid out in the bound VBO
        GLint posAttrib = glGetAttribLocation(shaderProgram, "position");
        glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, 0);
        glEnableVertexAttribArray(posAttrib);
     
        while (glfwGetWindowParam(GLFW_OPENED))
        {
            // Clear to black, then draw the triangle
            glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
            glDrawArrays(GL_TRIANGLES, 0, 3);
            glfwSwapBuffers();
     
            if (glfwGetKey(GLFW_KEY_ESC) == GLFW_PRESS) {
                break;
            }
        }
     
        return 0;
    }

    The code is fairly simple, and I can't find anything in the docs that would make it incorrect.

    If anyone could take a look at my code, that would be really great!

    Thanks!

  2. #2
    arekkusu (Advanced Member)
    Check glGetError() (or "Break on Error" in OpenGL Profiler.)

    If GLEW/GLFW properly set up your function pointers and your context, then glGenVertexArraysAPPLE() should throw INVALID_OPERATION in a Core Profile context. You should use glGenVertexArrays() instead (no, these APIs are not the same, because of the must-gen-to-bind semantics enforced by the Core Profile).
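
    For reference, a minimal sketch of that change, reusing the vao handle from the original post:

    Code :
        GLuint vao;
        glGenVertexArrays(1, &vao);   // core entry point, no APPLE suffix
        glBindVertexArray(vao);       // gen before bind, as Core Profile requires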


  3. #3
    Junior Member Newbie
    OK, I'm not using the APPLE() calls anymore. I am getting OpenGL error 1280 (GL_INVALID_ENUM) when I put this code above glGenVertexArrays():

    Code :
     
        GLenum err;
        while ((err = glGetError()) != GL_NO_ERROR) {
            std::cerr << "OpenGL error: " << err << std::endl;
        }

    I can't see where I'm passing an invalid enum, though.

    Thanks for your help.

  4. #4
    Junior Member Newbie
    I've now downloaded OpenGL Profiler, and here is the detail on my invalid enum:

    Code :
    glGetString(GL_EXTENSIONS); returns: 0x00000000 
    	Error: GL_INVALID_ENUM
    	Context: 0x7f981c828800
    	Virtual Screen:  0/2
    	kCGLCPCurrentRendererID:  16918030 (0x0102260e)
    	GL_RENDERER:  NVIDIA GeForce 9400M OpenGL Engine
    	GL_VENDOR:  NVIDIA Corporation
    	GL_VERSION:  3.2 NVIDIA-8.12.47 310.40.00.05f01
    	kCGLCPGPUFragmentProcessing:  GL_TRUE
    	kCGLCPGPUVertexProcessing:  GL_TRUE
    Function call stack: 
    	0: 0x109fa780a in <libGLEW.1.9.0.dylib> 
    	1: 0x109f9c30d in main at main.cpp: 71 
    	2: 0x7fff897d37e1 in start in <libdyld.dylib> 
    	3: 0x00000001 in <Open.GL>

    I'm not exactly sure what this means. Does it help at all?

  5. #5
    arekkusu (Advanced Member)
    That means GLEW is broken. You cannot query GL_EXTENSIONS in a Core Profile context; instead you must use glGetStringi() to query individual extension strings.
    I'm not sure what GLEW is doing with its extension checking, but that error by itself shouldn't break your rendering.
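
    For example, a minimal sketch of enumerating extensions one at a time in a Core Profile context:

    Code :
        GLint numExtensions = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);
        for (GLint i = 0; i < numExtensions; ++i) {
            // Each extension string is queried individually by index
            std::cout << glGetStringi(GL_EXTENSIONS, i) << std::endl;
        }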

    It looks like you don't set up an input array for your "color" attribute, so the data for that attribute will come from the "current" vertex. Which defaults to (0,0,0,1).
    You've also set the clear color to (0,0,0,1) so you're drawing a black triangle on a black background. Change either the clear color or the triangle color and you should see something.
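
    To illustrate, a sketch of both options, reusing the shaderProgram handle from the original post (colAttrib is just an illustrative name):

    Code :
        // Option 1: non-black background, so a black triangle becomes visible
        glClearColor(0.2f, 0.2f, 0.4f, 1.0f);
     
        // Option 2: set the "current" value of the color attribute to red
        GLint colAttrib = glGetAttribLocation(shaderProgram, "color");
        glVertexAttrib3f(colAttrib, 1.0f, 0.0f, 0.0f);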

  6. #6

  7. #7
    Junior Member Newbie
    Quote Originally Posted by arekkusu:
    It looks like you don't set up an input array for your "color" attribute, so the data for that attribute will come from the "current" vertex. Which defaults to (0,0,0,1).
    You've also set the clear color to (0,0,0,1) so you're drawing a black triangle on a black background. Change either the clear color or the triangle color and you should see something.

    This worked; I must have made a mistake while following the tutorial. Thanks!
