Sucky OpenGL on Mac

Okay. I have had several problems with OpenGL/GLUT on MacOS Classic:

GL_POINT_SMOOTH has no effect
glDrawPixels() is f#@$ed up
glutBitmapLength() does nothing
so do glutSetCursor() and glutWarpPointer()

Can anyone help with this? I can work around glutBitmapLength(), but glDrawPixels() is literally a show-stopping glitch.

GL_POINT_SMOOTH has no effect

I am afraid there is no workaround for this on the Rage Pro/128. For other cards I don’t know…

Since I don’t use GLUT, I will be of no help for your GLUT issues.

glDrawPixels should work fine. Does your code work on other platforms?
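In case it helps, here is the minimal usage pattern with the unpack state spelled out; a stale GL_UNPACK_ALIGNMENT (it defaults to 4) is a common cause of skewed rows and garbage-looking pixels. This is only a sketch, not a claim about what’s wrong in your code: x, y, width, height, and pixels are placeholder names.

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   /* rows of 'pixels' are tightly packed */
glRasterPos2i(x, y);                     /* window position of the lower-left corner */
glDrawPixels(width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);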

glutBitmapLength has precisely the same source as the X11 version IIRC, and should therefore work identically.
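If it really is returning nothing useful on your build, a simple fallback is to sum the per-character widths yourself with glutBitmapWidth. Just a sketch; MyBitmapLength is a made-up name:

#include <glut.h>   /* adjust the include path for your GLUT install */

/* Returns the pixel width of a bitmap string, one glyph at a time. */
static int MyBitmapLength(void *font, const unsigned char *s)
{
    int width = 0;
    while (*s != '\0')
        width += glutBitmapWidth(font, *s++);   /* advance of one character */
    return width;
}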

The code to make glutWarpPointer work is in the source code distribution. It was disabled because (a) it’s against the HI guidelines and (b) it doesn’t work in Carbon, only classic. You should be able to recompile GLUT with that enabled.

If you’re willing to add resources to your project, it should be relatively easy to concoct a glutSetCursor – IIRC, there’s a simple Toolbox call to set the cursor to the contents of a CURS resource.
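Something along these lines ought to do it. This is only a sketch, assuming you have the classic Toolbox interfaces on hand; MySetCursorFromCURS is a made-up name, while GetCursor, SetCursor, and InitCursor are the actual Toolbox calls.

#include <Quickdraw.h>
#include <ToolUtils.h>   /* GetCursor is declared in one of these two, depending on your interface version */

/* Load a CURS resource by ID and make it the current cursor. */
static void MySetCursorFromCURS(short cursID)
{
    CursHandle h = GetCursor(cursID);   /* fetches the CURS resource */
    if (h != NULL)
        SetCursor(*h);
    else
        InitCursor();                   /* fall back to the standard arrow */
}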

blending must be disabled for GL_POINT_SMOOTH to work. You will also need a blend function.

I’m not a mind reader so I can’t help with your Draw Pixels problem.

When I try to use glDrawPixels, it draws some of the pixels, but none of them in the right place. Also, it makes up some totally new pixels and draws them too.

Now that I’ve switched to GL_TEXTURE_2D, I have even more problems. Now instead of being garbled, my images are ignored! It only draws white squares!

Here’s the drawing code:

glColor4f(1.0, 1.0, (missiles > 0 &&
           daBaddie->pos[0] < xpos + BADDIE_IMAGE_SIZE &&
           daBaddie->pos[0] > xpos - BADDIE_IMAGE_SIZE) ? 0.0 : 1.0, 1.0);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, BADDIE_TEXTURE(lrmult + 1));
glBegin(GL_QUADS);
    glTexCoord2f(0.0, 1.0);
    glVertex2i(daBaddie->pos[0] - BADDIE_IMAGE_SIZE / 2,
               daBaddie->pos[1] - BADDIE_IMAGE_SIZE / 2);
    glTexCoord2f(1.0, 1.0);
    glVertex2i(daBaddie->pos[0] + BADDIE_IMAGE_SIZE / 2,
               daBaddie->pos[1] - BADDIE_IMAGE_SIZE / 2);
    glTexCoord2f(1.0, 0.0);
    glVertex2i(daBaddie->pos[0] + BADDIE_IMAGE_SIZE / 2,
               daBaddie->pos[1] + BADDIE_IMAGE_SIZE / 2);
    glTexCoord2f(0.0, 0.0);
    glVertex2i(daBaddie->pos[0] - BADDIE_IMAGE_SIZE / 2,
               daBaddie->pos[1] + BADDIE_IMAGE_SIZE / 2);
glEnd();
glDisable(GL_TEXTURE_2D);

and here’s the setup code:

int n;
// glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);
glGenTextures(IMAGE_COUNT * 2 + 2, (unsigned long*) textures);
for (n = 0; n < IMAGE_COUNT; n++) {
    glBindTexture(GL_TEXTURE_2D, BADDIE_TEXTURE(n));
    glTexImage2D(GL_TEXTURE_2D, 0, 4, BADDIE_IMAGE_SIZE, BADDIE_IMAGE_SIZE, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, baddies[n]);
    glBindTexture(GL_TEXTURE_2D, GOODIE_TEXTURE(n));
    glTexImage2D(GL_TEXTURE_2D, 0, 4, MAIN_IMAGE_SIZE, MAIN_IMAGE_SIZE, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, goodies[n]);
}

I know it’s rather hairy, but this is my first attempt at a color game using a non-HyperCard API.

How do you load the textures from disk?

What does lrmult + 1 equal?

What memory do baddies[n] and goodies[n] point to?

You are creating an array of texture IDs in (unsigned long*) textures, but you are binding to texture IDs from BADDIE_TEXTURE(n) & GOODIE_TEXTURE(n).
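In other words, bind to the names glGenTextures actually gave you. A minimal sketch of what I mean (the array indexing here is illustrative, not your macros). Setting the min filter is also worth doing: a texture with no mipmap levels and the default GL_NEAREST_MIPMAP_LINEAR min filter is incomplete, and an incomplete texture effectively disables texturing for that quad, which would explain plain white squares.

GLuint textures[IMAGE_COUNT * 2 + 2];
glGenTextures(IMAGE_COUNT * 2 + 2, textures);

/* inside the loop over n, as before */
glBindTexture(GL_TEXTURE_2D, textures[n]);   /* a name glGenTextures returned */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, BADDIE_IMAGE_SIZE, BADDIE_IMAGE_SIZE,
             0, GL_RGBA, GL_UNSIGNED_BYTE, baddies[n]);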

I’ve just noticed that I stuffed up previously…

blending must be ENABLED for GL_POINT_SMOOTH to work
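That is, the state you want looks something like this (standard OpenGL calls; GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA is just the usual choice of blend function, and the point size is arbitrary):

glEnable(GL_POINT_SMOOTH);
glEnable(GL_BLEND);                                 /* blending ON, not off */
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);  /* a typical blend function */
glPointSize(4.0f);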
