Part of the Khronos Group
OpenGL.org


Thread: glBindBuffers not declared, glew problem?

  1. #1
    Newbie
    Join Date
    Feb 2018
    Posts
    2

    glBindBuffers not declared, glew problem?

    Hi, I wrote a piece of code using deprecated stuff to learn OpenGL (with glfw, glm and gl3.h includes on Linux). Now I want to replace all the glBegin/End stuff with VBOs.

    To do that I am trying to use glew as an extension loader library.
    I include the header before any glm stuff and I removed the gl3.h includes. I did not forget to link it. I also call glewInit() in my glfw class (not that it matters yet, because I can't even compile).

    Can someone explain what I am doing wrong? I really thought including glew would declare the VBO-related functions.

    I also tried using glext and glew combined, since I saw that the functions I need are declared there, but it did not work out.

    I will wait for a first response before including more code, because I am pretty sure I am doing something obviously silly.

    Here are the includes of the main file:
    Code :
    #include <stdlib.h>
    #include <stdio.h>
     
     
    #include "interface.h"//need to be first (include glew)
    #include "gpu.h" <------------------------------ compilation error in gpu.cc 
    #include "camera.h"
    #include "inputmanager.h"
    #include "Shader.h"
    #include "triangle.h"
     
    //to move
    #define GLM_FORCE_RADIANS
    #include <glm/gtc/type_ptr.hpp>

    Here are the includes in interface.h (no includes in the .cc except interface.h on the first line):
    Code :
    #ifndef INTERFACE
    #define INTERFACE
    #include <GL/glew.h>
    #include "inputmanager.h"
    #include <GLFW/glfw3.h>
    #include <stdlib.h>//error control
    #include <stdio.h>
     
    class InputManager; //forward declaration

    Here is where I try to use VBO stuff in gpu.cc, and where compilation fails:
    Code :
    #include "gpu.h"
     
    using namespace glm;
    Gpu::Gpu(int maxRanderedTriangles){
     
    	//TODO: once there are several shaders, use a shader manager class instead
    	//load the camera Shader 
     
    	shader=new Shader("view/drawer/gpu/Shaders/geometry.vert", "view/drawer/gpu/Shaders/texture.frag");
    	shader->charger(); //compile the shader and load it onto the card
     
    	//retrieve the uniform matrix locations
    	lookAtMatrixLocation = glGetUniformLocation(shader->getProgramID(), "lookAtMatrix");
    	projectionMatrixLocation = glGetUniformLocation(shader->getProgramID(), "projectionMatrix");
    	if (projectionMatrixLocation==-1 || lookAtMatrixLocation==-1){
    		printf("error while loading uniform address");
    		exit(1);
    	}
     
    	//Manage VBO//<----------------------------------------if I comment from here to
    	//get an id for the vbo
    	glGenBuffers(1, &vboId);
    	//bind the new empty vbo
    	glBindBuffers(GL_ARRAY_BUFFER, vboId);
     
    	int neededSize= maxRanderedTriangles * 3 * sizeof(float);
    	//allocate empty vram
    	glBufferData(GL_BUFFER_DATA, neededSize, 0, GL_STREAM_DRAW);//<---------------------------down to here, everything compiles and works fine 
     
    }
     
    Gpu::~Gpu(){
    	glDeleteBuffers(1,&vboId);//<---------------------------------------------------------------- apparently I do have that function :??
    	delete(shader);
     
    }
     
    void Gpu::transferProjectionMatrix(mat4 matrix){
    	glUniformMatrix4fv(projectionMatrixLocation , 1, GL_FALSE, glm::value_ptr(matrix));
    }
     
    void Gpu::transferLookAtMatrix(mat4 matrix){
    	glUniformMatrix4fv(lookAtMatrixLocation , 1, GL_FALSE, glm::value_ptr(matrix));
    }
     
    void Gpu::cameraMode(){
    	glUseProgram(shader->getProgramID());
    }

    here is gpu.h:
    Code :
    #ifndef GPUGUARD
    #define GPUGUARD
     
    #include "Shader.h"
    #include <glm/glm.hpp>
     
    //to move in shader
    #define GLM_FORCE_RADIANS
    #include <glm/gtc/type_ptr.hpp> //cross-platform cast of matrices for the gpu
     
    /*
     *
     * This class ensures gpu/cpu communication 
     *
     * */
    class Gpu{
    	public:
    		Gpu(int maxRenderedTriangle);//compile the camera shader, save the vram addresses of the uniforms, allocate bucket memory on both cpu and gpu sides 
    		~Gpu();	
    		/*modify matrix in vram*/
    		void transferLookAtMatrix(glm::mat4 matrix);
    		void transferProjectionMatrix(glm::mat4 matrix);
     
    		void cameraMode();//use camera Shaders
    	private:
    		Shader* shader;
    		GLuint lookAtMatrixLocation;//TODO pass it in the camera shader
    		GLuint projectionMatrixLocation;//idem
     
    		GLuint vboId;
    		GLuint vboLocation;
    		float* bucket; 
     
    };
    #endif

    Also, here is my makefile, prepare to scream (I need to learn how to write proper ones, or learn CMake instead):
    Code :
    CC=g++
    CFLAGS=
    LDFLAGS= -lGLEW -lGLU -lGL -lglfw -lrt -lm -ldl -Iincludes -Ilibs -g #-Wall -Wextra -pedantic -O2
    EXEC=3D_world
     
    all: $(EXEC)
     
    $(EXEC): clean 
    	@find . -name "*.h" -exec ln -s .{} -t includes/ \; 2> /dev/null #putting all headers in /includes
    	@$(CC) $(CFLAGS) $(LDFLAGS) main.cc view/camera/camera.cc view/interface/inputmanager.cc view/interface/interface.cc view/drawer/gpu/Shader.cpp view/drawer/gpu/gpu.cc world/primitives/triangle.cc -o $(EXEC) 
     
     
    .PHONY: clean debug
    clean:
    	@rm -rf *.o
    	@rm -f $(EXEC)
     
    debug: LDFLAGS += -DDEBUG
    debug: $(EXEC)
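
    [Editor's note] For comparison, a minimal cleaned-up sketch of the same build. This is an untested assumption-laden rewrite, not part of the thread: compile flags such as -I and -g belong in CFLAGS rather than LDFLAGS, libraries go after the sources on the link line, and -DDEBUG is a compile flag, not a link flag.
    Code :
    CC      = g++
    CFLAGS  = -Iincludes -Ilibs -g -Wall -Wextra
    LDLIBS  = -lGLEW -lGLU -lGL -lglfw -lrt -lm -ldl
    SRC     = main.cc view/camera/camera.cc view/interface/inputmanager.cc \
              view/interface/interface.cc view/drawer/gpu/Shader.cpp \
              view/drawer/gpu/gpu.cc world/primitives/triangle.cc
    EXEC    = 3D_world
     
    all: $(EXEC)
     
    $(EXEC): $(SRC)
    	$(CC) $(CFLAGS) $(SRC) -o $(EXEC) $(LDLIBS)
     
    debug: CFLAGS += -DDEBUG
    debug: $(EXEC)
     
    .PHONY: all clean debug
    clean:
    	rm -f $(EXEC)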

    I cannot tell whether I am failing to resolve my problem because of my lack of competence in C++ or in OpenGL. It could be both.
    To say that I am not an expert in C++ and compilation is an understatement. This is also my first OpenGL program.

    I hope you can spot what is wrong.

  2. #2
    Senior Member OpenGL Guru
    Join Date
    Jun 2013
    Posts
    2,925
    Quote Originally Posted by ninjaconcombre View Post
    Can someone explain what I am doing wrong?

    Code :
    	glBindBuffers(GL_ARRAY_BUFFER, vboId);
    There is no function named glBindBuffers in any existing OpenGL version. From the context, I suspect that you want glBindBuffer (without the trailing "s").
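
    For reference, a corrected version of the quoted setup might look like this (a sketch reusing the poster's variable names, not a complete program):
    Code :
    	//generate one buffer object name
    	glGenBuffers(1, &vboId);
    	//bind it to the array-buffer target: glBindBuffer, no trailing "s"
    	glBindBuffer(GL_ARRAY_BUFFER, vboId);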

  3. #3
    Newbie
    Join Date
    Feb 2018
    Posts
    2
    Quote Originally Posted by GClements View Post
    There is no function named glBindBuffers in any existing OpenGL version. From the context, I suspect that you want glBindBuffer (without the trailing "s").
    OK, I am really sad now. I lost like 3 hours on this stupid error.

    I was lured into thinking it was a stranger problem because I also had:
    Code :
    view/drawer/gpu/gpu.cc:27:15: error: ‘GL_BUFFER_DATA’ was not declared in this scope
      glBufferData(GL_BUFFER_DATA, neededSize, 0, GL_STREAM_DRAW);

    but did not mention it.
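
    For completeness, a corrected version of that line as well (a sketch with the poster's variable names; the target enum must match the one used in the glBindBuffer call):
    Code :
    	//GL_BUFFER_DATA is not a GL enum; the buffer target is GL_ARRAY_BUFFER
    	glBufferData(GL_ARRAY_BUFFER, neededSize, nullptr, GL_STREAM_DRAW);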

    I will notify the author of https://openclassrooms.com/courses/d...1#/id/r-964157

    Well, thank you very much.

    I am super sorry, we should delete this thread.
