Weird GL Problem on GeForce and Software Rendering

I have a weird problem with the GL terrain engine I am programming.
Everything works perfectly with my Voodoo3 2000 AGP.
With a TNT2, everything is cool until lighting is applied to
the trees and gun model: everything that is lit is just a black
silhouette.
With a GeForce2, and any non-hardware-accelerated card, the terrain and
trees show this funny cross-hatching effect that fades in and out,
as though the depth buffer is not being cleared properly.
Why, with OpenGL being such a standard and all, would things be so
problematic on different cards?

I realise I could be coding it wrong (any ideas? I'm stressing),
but why would everything work great on some cards and go completely
wrong on others? Everything works as I have coded it to work
on my PC - so why would it be different on others?

I can email you screenshots if you want.

Please help - I’m desperate.
Chris

Two guesses that come to mind:

turn back-face culling on, and
apply material colors (glMaterial)?
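
Something like this is what I mean - a minimal immediate-mode sketch, where the material values are just placeholders, not anything from your engine:

    GLfloat ambient[] = { 0.2f, 0.2f, 0.2f, 1.0f };  /* placeholder values */
    GLfloat diffuse[] = { 0.8f, 0.8f, 0.8f, 1.0f };

    /* Enable back-face culling; GL_CCW must match your winding order. */
    glEnable(GL_CULL_FACE);
    glCullFace(GL_BACK);
    glFrontFace(GL_CCW);

    /* Give lit surfaces an explicit material - plain glColor calls are
       ignored while GL_LIGHTING is on unless color material is used. */
    glMaterialfv(GL_FRONT, GL_AMBIENT, ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, diffuse);

    /* Or let per-vertex glColor drive the material instead: */
    glEnable(GL_COLOR_MATERIAL);
    glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);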

Are you using surface normals for the lighting, and are they pointing in the right direction?
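
For reference, this is the sort of thing I mean; the normal here is a placeholder up-vector, yours would come from your terrain data:

    /* Specify a unit-length, outward-facing normal before each vertex.
       GL_NORMALIZE rescales normals if the modelview matrix scales. */
    glEnable(GL_NORMALIZE);

    GLfloat nx = 0.0f, ny = 1.0f, nz = 0.0f;  /* placeholder normal */
    glBegin(GL_TRIANGLES);
        glNormal3f(nx, ny, nz);
        glVertex3f(0.0f, 0.0f, 0.0f);  /* CCW when seen from +y, */
        glVertex3f(0.0f, 0.0f, 1.0f);  /* matching the normal    */
        glVertex3f(1.0f, 0.0f, 0.0f);
    glEnd();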

One more guess comes to mind: is your light positioned behind the polygons?
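
And a classic gotcha with that: GL_POSITION is transformed by whatever is in the modelview matrix when you call glLightfv, so set it after your camera transform. A rough sketch (the position value is just a placeholder):

    /* w = 1.0 makes it a positional light; w = 0.0 would be directional. */
    GLfloat light_pos[] = { 0.0f, 50.0f, 0.0f, 1.0f };  /* placeholder */

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    /* ... apply your camera transform here ... */
    glLightfv(GL_LIGHT0, GL_POSITION, light_pos);

    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);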

Everything is as it should be - the terrain
is lit manually with a pre-process, so that's not the problem - and the fact that it works fine on the majority of 3D cards means that either the GeForce's drivers are screwed up or every other card is getting it wrong!
Give me your mail addresses and I can send screenshots if you want.

Try using a 32-bit z-buffer on the GeForce cards. It looks better, and there's no evident performance hit. Also try rendering the scene twice, once with a near/far of 1 and 100, and again with 100/(really high number). Unless, of course, it'd really chew up your FPS…?
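
In case the two-pass idea is unclear, it's roughly this; the gluPerspective numbers match the ranges above, and draw_far_scene()/draw_near_scene() are stand-ins for your own draw calls:

    double aspect = 4.0 / 3.0;  /* placeholder aspect ratio */

    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* Pass 1: the far slice of the scene gets its own depth range. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, aspect, 100.0, 100000.0);
    glMatrixMode(GL_MODELVIEW);
    draw_far_scene();

    /* Clear depth only, so the near slice always draws on top. */
    glClear(GL_DEPTH_BUFFER_BIT);

    /* Pass 2: the near slice, with all the depth precision to itself. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, aspect, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    draw_near_scene();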

Done that already, thanks anyway. Any other suggestions?

Chris