Lighting glitch - yet another fishtank question.

I guess fishtanks are the latest fashion thing right now…

I’m making one as a class project as well, and I’d like to ask a question about lighting…

I’m loading a 3ds model of a fish (specifically, a Turquoise Discus) and setting up a few lights.

The glitch appears only on the fins, and only when I move far enough away from the model - the lighting on them sort of quivers, as in the picture.

The fins don’t look like a single flat surface (though I can’t be sure - I don’t have a 3D editor to confirm or deny it), and the normals seem to be fine, since the lighting is generally correct on both sides of the fins when I move in closer.

Turning texturing off doesn’t change the issue at all.

Laptop configurations:
1. Lenovo X60S: Core 2 Duo 1.6 GHz and, most importantly, Intel GMA 950 integrated graphics.
2. Acer Aspire One AOA150: single-core 1.6 GHz, same graphics.

A friend of mine said she didn’t notice the quivering (she has a normal graphics card), so there’s a chance the GMA is to blame, but I tend to trust Intel more than my own coding abilities…

What could be the possible cause of the problem? Should I provide any additional information?

Thank you very much in advance, and sorry for bothering you with stupid questions like this one.

You should try it on a ‘real’ graphics card and see for yourself. I have learned that it is not very wise to trust Intel when it comes to OpenGL.

Your screenshot is … hmm … not very clear, but it is probably a case of z-fighting: depth buffer precision is mostly packed near the camera, and farther away it becomes less and less precise, so the depth test cannot reliably decide what is in front and what is behind.

More detailed explanation:
http://www.sjbaker.org/steve/omniv/love_your_z_buffer.html
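
To get a feel for the numbers, here is a quick back-of-the-envelope sketch of the math from that page. The near/far planes and viewing distance are made-up example values, not taken from your program:

#include <math.h>
#include <stdio.h>

/* Smallest depth difference a b-bit depth buffer can still resolve at eye
 * distance d, for a perspective projection with near plane n and far plane f.
 * Roughly: d*d*(f - n) / (f * n * 2^b). */
static double depth_step(double n, double f, int bits, double d)
{
    return d * d * (f - n) / (f * n * pow(2.0, bits));
}

int main(void)
{
    /* example planes: near = 0.1, far = 1000, fish at distance 50 */
    printf("16-bit buffer: %.4f units\n", depth_step(0.1, 1000.0, 16, 50.0));
    printf("24-bit buffer: %.4f units\n", depth_step(0.1, 1000.0, 24, 50.0));
    return 0;
}

With those example planes and a 16-bit buffer, surfaces closer together than roughly 0.4 units at that distance fall into the same depth value - plenty to make thin fins flicker. With 24 bits the step is about 256 times smaller.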

The difference between video cards may be caused by not explicitly asking for a 24-bit z-buffer. The Intel GMA may default to only 16 bits, which means very low depth precision, while another card may default to 24 bits. This is not really wrong on Intel’s side; it is more a case of ‘defaults should never be trusted’.

How to ask for it depends on the toolkit you use to initialize the GL framebuffer: GLUT, Win32, …?
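
If it turns out to be GLUT, something along these lines should do it - just a sketch, not your actual code: glutInitDisplayString (GLUT 3.7+ / freeglut) lets you ask for a minimum depth size, and the window size and title here are placeholders.

#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    GLint depth_bits = 0;

    glutInit(&argc, argv);
    /* request RGBA, double buffering and at least a 24-bit depth buffer */
    glutInitDisplayString("rgba double depth>=24");
    glutInitWindowSize(800, 600);
    glutCreateWindow("fishtank");

    /* check what the driver actually granted instead of trusting the default */
    glGetIntegerv(GL_DEPTH_BITS, &depth_bits);
    printf("depth buffer: %d bits\n", depth_bits);

    /* ... register display/reshape callbacks and call glutMainLoop() here ... */
    return 0;
}

If the printed value is still 16, the driver could not (or would not) give you more, and the remaining lever is pushing the near plane farther from the camera.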

Thank you very much for the answers.

I’m using GLUT to initialize OpenGL.

Changing the depth buffer to 24 bits helped, and it removed some other annoying glitches as well - thank you!