glBindBuffers not declared, glew problem?

Hi, I wrote a piece of code using deprecated features to learn OpenGL (with glfw, glm and gl3.h includes on Linux). Now I want to replace all the glBegin/glEnd stuff with VBOs.

To do that I am trying to use GLEW as the extension loader library.
I include its header before any glm stuff and I removed the gl3.h includes. I did not forget to link against it. I also call glewInit() in my glfw class (though that does not matter yet, since I can't even compile).

Can someone explain what I am doing wrong? I really thought that including GLEW would declare the VBO-related functions.

I also tried combining glext.h with GLEW, since I saw that the functions I need are declared there, but it did not work out.

I will wait for a first response before including more code, because I am pretty sure that I am doing something obviously silly.

Here are the includes of the main file:


#include <stdlib.h>
#include <stdio.h>


#include "interface.h"//needs to be first (includes glew)
#include "gpu.h" <------------------------------ compilation error in gpu.cc 
#include "camera.h"
#include "inputmanager.h"
#include "Shader.h"
#include "triangle.h"

//to move
#define GLM_FORCE_RADIANS
#include <glm/gtc/type_ptr.hpp> 


Here are the includes in interface.h (no includes in the .cc except interface.h on the first line):


#ifndef INTERFACE
#define INTERFACE
#include <GL/glew.h>
#include "inputmanager.h"
#include <GLFW/glfw3.h>
#include <stdlib.h>//error control
#include <stdio.h>

class InputManager; //forward declaration

Here is where I try to use the VBO stuff in gpu.cc, and where compilation fails:


#include "gpu.h"

using namespace glm;
Gpu::Gpu(int maxRenderedTriangles){
	
	//TODO: once there are several shaders, use a shader manager class instead
	//the camera shader
	
	shader=new Shader("view/drawer/gpu/Shaders/geometry.vert", "view/drawer/gpu/Shaders/texture.frag");
	shader->charger(); //compile the shader and load it onto the card

	//retrieve the uniform matrix locations
	lookAtMatrixLocation = glGetUniformLocation(shader->getProgramID(), "lookAtMatrix");
	projectionMatrixLocation = glGetUniformLocation(shader->getProgramID(), "projectionMatrix");
	if (projectionMatrixLocation==-1 || lookAtMatrixLocation==-1){
		printf("error while loading uniform locations");
		exit(1);
	}

	//Manage VBO//<----------------------------------------if I comment from here to
	//get an id for the vbo
	glGenBuffers(1, &vboId);
	//bind the new empty vbo
	glBindBuffers(GL_ARRAY_BUFFER, vboId);

	int neededSize= maxRenderedTriangles * 3 * sizeof(float);
	//allocate empty vram
	glBufferData(GL_BUFFER_DATA, neededSize, 0, GL_STREAM_DRAW);//<---------------------------here, everything compiles and works fine

}

Gpu::~Gpu(){
	glDeleteBuffers(1,&vboId);//<---------------------------------------------------------------- apparently I do have that function?!
	delete(shader);

}

void Gpu::transferProjectionMatrix(mat4 matrix){
	glUniformMatrix4fv(projectionMatrixLocation , 1, GL_FALSE, glm::value_ptr(matrix));
}

void Gpu::transferLookAtMatrix(mat4 matrix){
	glUniformMatrix4fv(lookAtMatrixLocation , 1, GL_FALSE, glm::value_ptr(matrix));
}

void Gpu::cameraMode(){
	glUseProgram(shader->getProgramID());
}

Here is gpu.h:


#ifndef GPUGUARD
#define GPUGUARD

#include "Shader.h"
#include <glm/glm.hpp>

//to move in shader
#define GLM_FORCE_RADIANS
#include <glm/gtc/type_ptr.hpp> //cross-platform cast of matrices for the gpu

/*
 *
 * This class ensures gpu/cpu communication 
 *
 * */
class Gpu{
	public:
		Gpu(int maxRenderedTriangles);//compiles the camera shader, saves the vram addresses of the uniforms, allocates bucket memory on both the cpu and gpu sides 
		~Gpu();	
		/*modify matrix in vram*/
		void transferLookAtMatrix(glm::mat4 matrix);
		void transferProjectionMatrix(glm::mat4 matrix);

		void cameraMode();//use camera Shaders
	private:
		Shader* shader;
		GLint lookAtMatrixLocation;//GLint: glGetUniformLocation returns -1 on failure //TODO move it into the camera shader
		GLint projectionMatrixLocation;//idem

		GLuint vboId;
		GLuint vboLocation;
		float* bucket; 

};
#endif

Also, here is my makefile; prepare to scream (I need to learn how to write proper ones, or learn CMake instead).


CC=g++
CFLAGS= -Iincludes -Ilibs -g #-Wall -Wextra -pedantic -O2
LDFLAGS= -lGLEW -lGLU -lGL -lglfw -lrt -lm -ldl
EXEC=3D_world

all: $(EXEC)

$(EXEC): clean 
	@find . -name "*.h" -exec ln -s .{} -t includes/ \; 2> /dev/null #putting all headers in /includes
	@$(CC) $(CFLAGS) main.cc view/camera/camera.cc view/interface/inputmanager.cc view/interface/interface.cc view/drawer/gpu/Shader.cpp view/drawer/gpu/gpu.cc world/primitives/triangle.cc -o $(EXEC) $(LDFLAGS)


.PHONY: clean debug
clean:
	@rm -rf *.o
	@rm -f $(EXEC)
	
debug: CFLAGS += -DDEBUG
debug: $(EXEC) 




I cannot tell whether I am failing to solve my problem because of my lack of competence in C++ or in OpenGL. It could be both.
Saying that I am not an expert in C++ and compilation is a euphemism. This is also my first OpenGL program.

I hope you will figure it out.

There is no function named [var]glBindBuffers[/var] in any existing OpenGL version. From the context, I suspect that you want [var]glBindBuffer[/var] (without the trailing “s”).
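To spell it out with a minimal sketch (not your full constructor): [var]glGenBuffers[/var] really is plural, because it can generate several buffer names at once, while [var]glBindBuffer[/var] binds exactly one buffer to one target, so it is singular:

```cpp
GLuint vboId;
glGenBuffers(1, &vboId);               // plural: generates N buffer names (here N = 1)
glBindBuffer(GL_ARRAY_BUFFER, vboId);  // singular: binds one buffer to one target
```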

OK, I am really sad now. I lost about 3 hours on this stupid error.

I was lured into thinking it was a stranger problem because I also had:


view/drawer/gpu/gpu.cc:27:15: error: ‘GL_BUFFER_DATA’ was not declared in this scope
  glBufferData(GL_BUFFER_DATA, neededSize, 0, GL_STREAM_DRAW);

but did not mention it.
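For the record, in case it helps anyone else, here is what the corrected lines look like in my constructor (same names as in my gpu.cc above): the first argument of glBufferData is the bind target, so it is GL_ARRAY_BUFFER too; GL_BUFFER_DATA does not exist.

```cpp
glGenBuffers(1, &vboId);
glBindBuffer(GL_ARRAY_BUFFER, vboId);                          // was glBindBuffers
int neededSize = maxRenderedTriangles * 3 * sizeof(float);     // the constructor argument
glBufferData(GL_ARRAY_BUFFER, neededSize, 0, GL_STREAM_DRAW);  // was GL_BUFFER_DATA
```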

I will notify the author of https://openclassrooms.com/courses/developpez-vos-applications-3d-avec-opengl-3-3/les-vertex-buffer-objects-1#/id/r-964157

Well, thank you very much.

I am super sorry, we should delete this thread.