Problem with an example from the Red Book

I am trying to run an example piece of code from the OpenGL Red Book. The example is blendeqn.c, and the error comes from the glBlendEquation() function. With glBlendEquation(GL_MAX) I get the error "`GL_MAX' undeclared (first use this function)", with similar errors for each of the other glBlendEquation calls.

Oooh, goodie, I just did some work with this less than an hour ago (for a new NeHe lesson), so I have all the code for it right in front of me.

OK, first you need to include glext.h, because GL_MAX and all the other blend-equation constants are defined there.
You can get the file from the extensions page on opengl.org.

Next you need to load the blend equation entry point itself.
On Windows you first need to create a global function pointer:

PFNGLBLENDEQUATIONPROC glBlendEquation = NULL;

then, among the first lines of GL code (once a rendering context is current), call

glBlendEquation = (PFNGLBLENDEQUATIONPROC) wglGetProcAddress("glBlendEquation");

and you're done.
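Putting the two steps together, a minimal sketch of the Windows loading code might look like this (assuming glext.h is on your include path; load_blend_equation is just an illustrative name):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* defines GL_MIN, GL_MAX and PFNGLBLENDEQUATIONPROC */

PFNGLBLENDEQUATIONPROC glBlendEquation = NULL;

/* Call this once a GL rendering context has been made current;
   wglGetProcAddress returns NULL when called without one. */
void load_blend_equation(void)
{
    glBlendEquation = (PFNGLBLENDEQUATIONPROC) wglGetProcAddress("glBlendEquation");
    if (glBlendEquation == NULL)
    {
        /* the driver does not expose the entry point; calling it would crash */
        MessageBoxA(NULL, "glBlendEquation not available", "Error", MB_OK);
    }
}
```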

There are libraries out there (like GLEW) that do just this for you.

Thanks. Still having problems, though. Adding in that code means the program compiles all right, but it throws up an "XXXX.exe has encountered a problem" error at run time. Using GLEW it does not compile, saying "[Linker error] undefined reference to `_imp____glewBlendEquation'".

You could try "glBlendEquationEXT" instead of "glBlendEquation" in wglGetProcAddress…
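On some older drivers only the EXT entry point exists, so a hedged sketch (untested; it assumes the same global pointer declaration shown earlier in the thread) is to try both names:

```c
glBlendEquation = (PFNGLBLENDEQUATIONPROC) wglGetProcAddress("glBlendEquation");
if (glBlendEquation == NULL)   /* core name missing, try the extension name */
    glBlendEquation = (PFNGLBLENDEQUATIONPROC) wglGetProcAddress("glBlendEquationEXT");
if (glBlendEquation == NULL)
{
    /* neither entry point exists; calling through the NULL pointer is
       exactly the "XXXX.exe has encountered a problem" crash */
}
```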

Hi, I am working on this too. I recommend using GLEW instead; it's easy, and if your video card supports the extension you can use it directly without any additional work.
I hope the following code may give you some help.

#include <iostream>
#include <windows.h>
#include <GL/glew.h>
#include <GL/glut.h>

using namespace std ;

void init(void)
{
	glClearColor(1.0, 1.0, 0.0, 0.0) ;
	glBlendFunc(GL_ONE, GL_ONE) ;
	glEnable(GL_BLEND) ;
}

void display(void)
{
	glClear(GL_COLOR_BUFFER_BIT) ;
	glColor3f(0.0, 0.0, 1.0) ;
	glRectf(-0.5, -0.5, 0.5, 0.5) ;
	glFlush() ;
}

void reshape(int w, int h)
{
	glViewport(0, 0, (GLsizei)w, (GLsizei)h) ;
	glMatrixMode(GL_PROJECTION) ;
	glLoadIdentity() ;
	if(w<=h)
		glOrtho(-1.5, 1.5, -1.5*(GLfloat)h/(GLfloat)w,
		1.5*(GLfloat)h/(GLfloat)w, -10.0, 10.0) ;
	else
		glOrtho(-1.5*(GLfloat)w/(GLfloat)h, 
		1.5*(GLfloat)w/(GLfloat)h, -1.5, 1.5, -10.0, 10.0) ;
	glMatrixMode(GL_MODELVIEW) ;
	glLoadIdentity() ;
}

void keyboard(unsigned char key, int x, int y)
{
	switch(key)
	{
	case 'a':
	case 'A':
		//Note: glBlendEquation is part of the GL_ARB_imaging subset; call glGetString
		//first to confirm that your video card supports this extension.
		glBlendEquation(GL_FUNC_ADD) ;
		break ;
	case 's':
	case 'S':
		glBlendEquation(GL_FUNC_SUBTRACT) ;
		break ;
	case 'r':
	case 'R':
		glBlendEquation(GL_FUNC_REVERSE_SUBTRACT) ;
		break ;
	case 'm':
	case 'M':
		glBlendEquation(GL_MIN) ;
		break ;
	case 'x':
	case 'X':
		glBlendEquation(GL_MAX) ;
		break ;
	case 27:
		exit(0) ;
		break ;
	}
	glutPostRedisplay() ;
}

int main(int argc, char **argv)
{
	
	glutInit(&argc, argv) ;
	glutInitDisplayMode(GLUT_SINGLE|GLUT_RGB) ;
	glutInitWindowSize(200, 200) ;
	glutCreateWindow("Blend") ;
	
	//initialization of glew.
	GLenum err = glewInit() ;
	if (GLEW_OK != err)
	{
		MessageBoxA(NULL, "error", "Initial error", 1) ;
	}
	//if the extension string contains "GL_ARB_imaging" then you can use glBlendEquation
	const GLubyte *str = glGetString(GL_EXTENSIONS) ;
	cout << str << endl ;

	init() ;
	glutReshapeFunc(reshape) ;
	glutKeyboardFunc(keyboard) ;
	glutDisplayFunc(display) ;
	glutMainLoop() ;
	return 0 ;
}

No, the different commands and the new code both cause the same error. Could this be a problem with my graphics card?

I believe that the example blendeqn.c is about the OpenGL imaging subset. Make sure your card supports the GL_ARB_imaging extension first.

Sadly, ATI Radeon cards have never supported this extension, except the FireGL models.

Really? That's strange.
If you place the code and the necessary files (glew32.lib, glew32.dll) correctly, it should compile with no problem.
If your video card doesn't support this extension it will only cause a run-time error (a crash), but it should still compile without errors. Could you paste all of your code and the compile error?

The code is the same as Debugger's; the error is

[Linker error] undefined reference to `_imp____glewBlendEquation'  

Repeated several times

I'm not sure about my graphics card. Is there any way to check whether my card supports the extension?

Have you linked glew32.lib? It looks like a missing-library error.

// Use the following code to check whether your video card supports glBlendEquation; if the extension string contains "GL_ARB_imaging" then you can use it:

const GLubyte *str = glGetString(GL_EXTENSIONS) ;
cout << str << endl ;
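A plain substring search can match a prefix of a longer extension name, so a slightly more careful check walks the space-separated list instead. A small sketch (the helper name is made up):

```c
#include <string.h>

/* Returns 1 if name appears as a whole token in the space-separated
   extensions string (as returned by glGetString(GL_EXTENSIONS)). */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while (p != NULL && (p = strstr(p, name)) != NULL)
    {
        /* accept only whole-token matches, not prefixes of longer names */
        if ((p == extensions || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}
```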

Guys, glBlendEquation was made CORE in OpenGL 1.4.

So if you have OpenGL 1.4+ you have glBlendEquation.
(This includes ATI’s from ages back)

(The GL_ARB_imaging extension came first and includes a lot of stuff, but only a subset of it was rolled into core)
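Since glBlendEquation is core in 1.4, an alternative to the extension-string check is to parse the string returned by glGetString(GL_VERSION). A sketch (the helper name is made up):

```c
#include <stdio.h>

/* Returns 1 if a GL version string such as "1.4.0 - Build 7.14.10"
   reports at least major.minor, else 0. */
int gl_version_at_least(const char *version, int major, int minor)
{
    int vmajor = 0, vminor = 0;

    if (version == NULL || sscanf(version, "%d.%d", &vmajor, &vminor) != 2)
        return 0;
    return vmajor > major || (vmajor == major && vminor >= minor);
}
```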

Check your includes and libs, as advised above, for the link errors.

A quick Google search shows my ancient card (an S3 ProSavageDDR) only supports OpenGL 1.1. Looks like I'll have to find another computer to practice on until I get a new one for uni later this year.