View Full Version : triangle rendering with geForce

05-03-2005, 12:54 AM

I am currently trying to render about half a million triangles produced by my marching cubes algorithm.

This worked fine when I was rendering on the CPU (software OpenGL rendering).

But after I installed my GeForce card, something fishy happens:
the surface and shape of the rendered object stay the same, but lots of lines poke out from the surface of the object.

When I rotate the object (i.e., re-render it), the lines keep changing at each position.

This is strange and I have no clue what is causing it.

Are there any settings I need to apply to my card before using it for rendering?

Any help on this is highly appreciated.

Thanks in advance,

05-03-2005, 06:54 AM
Psst, you know you are not posting on the correct forum? This one is "User, gaming", not OpenGL coding. :)

Anyway, post a screenshot. I would say it is more likely a bug in your code that stays hidden in the software OpenGL renderer but shows up clearly in a different implementation (or a bug in the drivers).

05-03-2005, 10:39 AM
If I had to make a wild guess, I would say that you are seeing z-fighting, most likely because you are only getting a 16-bit z-buffer. If memory serves correctly, the NVIDIA driver will fall back to 16 bits if you ask for a 32-bit z-buffer without an 8-bit stencil.
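To see why the depth-buffer width matters here, consider a rough illustration (not the poster's code, and a simplification that ignores the nonlinear depth mapping): two surfaces a tiny distance apart get quantized to integer depth-buffer values, and if a 16-bit buffer maps both to the same value, the depth test can no longer order them, so fragments flicker as the view changes.

```python
def quantize_depth(z, bits):
    """Map a normalized depth z in [0, 1] to an integer depth-buffer value."""
    levels = (1 << bits) - 1
    return round(z * levels)

# Two nearly coplanar surfaces, one part in a million apart in depth.
z_front = 0.500000
z_back = 0.500001

for bits in (16, 24):
    a = quantize_depth(z_front, bits)
    b = quantize_depth(z_back, bits)
    print(f"{bits}-bit buffer: front={a}, back={b}, distinct={a != b}")
```

At 16 bits the two depths collapse to the same bucket, while at 24 bits they stay distinct, which is why explicitly requesting a 24-bit depth buffer (typically paired with an 8-bit stencil) often makes this kind of artifact disappear.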

On the other hand, artifacts created by incorrect polygon or line smoothing could also fit your description.

I agree with Zbuffer that a screenshot would be most helpful for troubleshooting your problem.
I also agree with him that this isn't the right board; you really want to ask this over at the "OpenGL coding beginner" board on this site.