Color table problems

Well, I’ve been going through the Red Book for a while and everything has been going OK until I hit page 328. It shows how an image can be inverted using color tables. The code is shown below.

#include <GL/glut.h>
#include <stdlib.h>

extern GLubyte* readImage(const char*, GLsizei*, GLsizei*);

GLubyte *pixels;
GLsizei width, height;

void init(void)
{
int i;
GLubyte colorTable[256][3];

pixels = readImage("Pic.bmp", &width, &height);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glClearColor(0.0, 0.0, 0.0, 0.0);
glShadeModel(GL_SMOOTH);
for(i = 0; i < 256; ++i)
{
	colorTable[i][0] = 255 - i;
	colorTable[i][1] = 255 - i;
	colorTable[i][2] = 255 - i;
}

glColorTable:  glGetColorTable(GL_COLOR_TABLE, GL_RGB, 256, GL_RGB, GL_UNSIGNED_BYTE, colorTable);   /* <-- error here */
glEnable(GL_COLOR_TABLE);   /* <-- error here */

}

void display(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glRasterPos2i(3, 3);
glDrawPixels(width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glFlush();
}

I tried to build the code and it gave two errors, shown here.

error C2065: 'glGetColorTable' : undeclared identifier
error C2065: 'GL_COLOR_TABLE' : undeclared identifier

They point to the two lines I’ve marked above. Being new to images, I have one idea why it’s wrong: I heard something about needing to convert the image into some sort of format before OpenGL can display it. If that’s true you’ll probably see me posting about how to format pictures later, though I don’t think that’s the problem here. Anyone care to shed some light on this?

TBOY

The functions you call are introduced in two extensions, namely EXT_paletted_texture and SGI_color_table. These extensions are, however, not supported by the recent generation of GPUs.
Here’s an extract from the EXT_paletted_texture specification:

Support

Intel 810/815.
Mesa.
Microsoft software OpenGL implementation.
Selected NVIDIA GPUs: NV1x (GeForce 256, GeForce2, GeForce4 MX,
GeForce4 Go, Quadro, Quadro2), NV2x (GeForce3, GeForce4 Ti,
Quadro DCC, Quadro4 XGL), and NV3x (GeForce FX 5xxxx, Quadro FX
1000/2000/3000).  NV3 (Riva 128) and NV4 (TNT, TNT2) GPUs and NV4x
GPUs do NOT support this functionality (no hardware support).
Future NVIDIA GPU designs will no longer support paletted textures.
S3 ProSavage, Savage 2000.
3Dfx Voodoo3, Voodoo5.
3Dlabs GLINT.

I checked the extensions on my ATI Radeon and neither of these two extensions is supported.
If you want to use this functionality you could resort to software rendering with Mesa, or Mangled Mesa, which combines hardware and software rendering if I’m not mistaken.
It’s possible that I overlooked something, but I checked all the GL headers and the only file I could find a reference to these functions in was gl.h. This is most likely there to allow hardware support for older GPUs.
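
If you want to check your own card, the extension string is easy to query once you have a GL context; something like the sketch below works (the hasExtension and checkColorTableSupport helpers are just names I made up, and a plain strstr can in theory match part of a longer extension name, so take it as a rough check):

#include <GL/glut.h>
#include <stdio.h>
#include <string.h>

/* Quick sketch: returns 1 if 'name' appears in the GL_EXTENSIONS string.
   Needs a current GL context, so call it after glutCreateWindow(). */
static int hasExtension(const char* name)
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

/* For example, call this right after the window is created: */
void checkColorTableSupport(void)
{
    if (!hasExtension("GL_EXT_paletted_texture") &&
        !hasExtension("GL_SGI_color_table"))
        printf("No color table extensions on this GPU.\n");
}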

Greetz,

Nico

Thanks for the tip. I’ll start looking at Mesa and GPA’s.

My bad, I meant GPU’s.

I looked into this a little, learned a little more, and decided color tables aren’t needed for what I’m trying to learn.

As of right now I’m trying to display a file called Pic.bmp. My new code looks like this.

#include <GL/glut.h>
#include <stdlib.h>
#include <stdio.h>

extern GLubyte* readImage(const char*, GLsizei*, GLsizei*);
GLubyte* pixels;
GLsizei width, height;

void init(void)
{
pixels = readImage("Data/Pic.bin", &width, &height);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glClearColor(0.0, 0.0, 0.0, 0.0);
}

void display(void)
{
glClear(GL_COLOR_BUFFER_BIT);
glRasterPos2i(10, 10);
glDrawPixels(256, 256, GL_RGB, GL_UNSIGNED_BYTE, pixels);
glFlush();
}

Unfortunately I’m still getting two errors, seen below.

error LNK2001: unresolved external symbol "unsigned char * __cdecl readImage(char const *,int *,int *)" (?readImage@@YAPAEPBDPAH1@Z)
fatal error LNK1120: 1 unresolved externals

Since I haven’t been able to load an image before, I’m not sure what the problem is. Any help is appreciated.
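
If it helps, my best guess is that the linker is missing an actual definition of readImage somewhere in the project; the extern line only declares it. Here is a rough sketch of the kind of function I imagine has to be linked in. I’m only guessing at the layout of Pic.bin (width and height stored as two ints, then raw RGB bytes), so this is a sketch and not the Red Book’s real loader:

#include <stdio.h>
#include <stdlib.h>
#include <GL/glut.h>

/* Sketch of a readImage definition matching the extern declaration above.
   It assumes the .bin file starts with the width and height stored as two
   ints, followed by width*height*3 bytes of RGB data; that layout is a
   guess on my part, not necessarily what the Red Book's loader expects. */
GLubyte* readImage(const char* filename, GLsizei* width, GLsizei* height)
{
    FILE* fp = fopen(filename, "rb");
    GLubyte* data;
    int w = 0, h = 0;

    if (fp == NULL) {
        fprintf(stderr, "Could not open %s\n", filename);
        exit(1);
    }

    /* read the image dimensions, then the raw RGB bytes */
    fread(&w, sizeof(int), 1, fp);
    fread(&h, sizeof(int), 1, fp);
    data = (GLubyte*)malloc(w * h * 3);
    fread(data, 1, w * h * 3, fp);
    fclose(fp);

    *width = (GLsizei)w;
    *height = (GLsizei)h;
    return data;
}

The mangled name in the error also suggests the project is being built as C++, so if the readImage definition lives in a file compiled as plain C, I guess the declaration would need to be wrapped in extern "C" for the linker to match them up.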