View Full Version : Weird behaviour with lighting and display lists...



ahkcasfm
05-05-2009, 01:23 AM
I'm using OpenGL in C++ (with GLUT). I've got a 256x256 matrix of vertices (generated using the plasma fractal algorithm), with various colors depending on mountain, grass, water, etc.

When I run on my home computer (Dual core 3.16GHz, 2GB RAM, 512MB graphics card), the behavior is as follows:
- No specular lighting: Good performance
- Specular lighting without display lists: Good performance
- Specular lighting with display lists: Terrible! (about 15fps)

Running the exact same code on the machines at my uni (which are NOWHERE NEAR as good as my home one), I get:
- No specular lighting: Good performance
- Specular lighting without display lists: Pretty laggy
- Specular lighting with display lists: Good performance

I guess the main thing I want to know is: how could using a display list make performance worse on my home PC? They're supposed to be really efficient!

Please help

ZbuffeR
05-05-2009, 01:49 AM
You didn't tell us the make and model of your graphics card.
Try without any display lists, to compare.

ahkcasfm
05-05-2009, 02:15 AM
Yeah sorry, it's a GeForce 8400 GS

I already did it without display lists, and like I said, that improved things. I have no idea how or why though.

And on the worse machines at my uni (not sure of the exact specs), display lists improve performance a LOT.

Rosario Leonardi
05-05-2009, 05:48 AM
Are you using windows? Did you update the video card drivers in your home computer?

ahkcasfm
05-05-2009, 09:35 PM
Yep, Windows. I'm pretty sure I already had the latest drivers, but I re-installed them just in case, and there's no change.

It's really the display list thing that's getting to me. I thought all display lists did was strip out the non-OpenGL commands and keep (and possibly optimise) the OpenGL ones, so performance should always be equal or better.
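For reference, a minimal sketch of the display-list pattern being described (the GL calls are standard OpenGL 1.x/2.x; the function names and the `terrainList` variable are hypothetical, and an active GL context from GLUT is assumed; this needs a real windowing setup to actually run):

```cpp
// Display-list sketch: compile once, replay every frame.
// Requires an active OpenGL context (e.g. created via GLUT).
#include <GL/gl.h>

GLuint terrainList = 0;  // hypothetical list name

void buildTerrainList() {
    terrainList = glGenLists(1);         // reserve one list name
    glNewList(terrainList, GL_COMPILE);  // record commands, don't execute yet
    // ... the usual glBegin/glMaterialfv/glNormal3f/glVertex3f calls
    //     that draw the 256x256 terrain mesh go here ...
    glEndList();
}

void display() {
    glCallList(terrainList);  // replay the recorded commands
}
```

In principle the driver is free to optimise the recorded command stream, which is why a compiled list is normally expected to be no slower than issuing the same calls immediately; whether that holds in practice depends on the driver.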

scratt
05-05-2009, 10:14 PM
Are you tracking GL errors when you run the app in different modes / on different machines?

Also, perhaps grab a copy of gDEBugger (you can get a 30-day trial) and have it profile your code to see if anything weird is happening: big memory swaps or something else. Although with a 512MB GPU that seems unlikely.

Just for future reference, you would be much better off going with VBOs instead of DLs, as DLs are on the way out (re. the 3.x deprecation). Changing over to VBOs would be relatively painless and much more future-proof, and the time taken will most likely be the same as (or less than) the time spent tracking down the cause of the problem with the DLs.
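To give an idea of what that changeover looks like, here is a minimal VBO sketch, assuming fixed-function OpenGL 1.5+ and a positions-only vertex layout (the function names, `terrainVbo`, and the data layout are hypothetical; an active GL context is assumed, and on Windows the `glGenBuffers`/`glBufferData` entry points must be fetched via an extension loader such as GLEW, since the system headers stop at GL 1.1; this needs a real windowing setup to actually run):

```cpp
// VBO sketch for a static mesh: upload once, draw many times.
// Requires an active OpenGL context and GL 1.5+ (or GLEW on Windows).
#include <GL/gl.h>

GLuint  terrainVbo = 0;          // hypothetical buffer object name
GLsizei terrainVertexCount = 0;

void uploadTerrain(const GLfloat* positions, GLsizei vertexCount) {
    terrainVertexCount = vertexCount;
    glGenBuffers(1, &terrainVbo);
    glBindBuffer(GL_ARRAY_BUFFER, terrainVbo);
    // GL_STATIC_DRAW hints that the data is written once and drawn often.
    glBufferData(GL_ARRAY_BUFFER,
                 vertexCount * 3 * sizeof(GLfloat), positions, GL_STATIC_DRAW);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}

void drawTerrain() {
    glBindBuffer(GL_ARRAY_BUFFER, terrainVbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (const void*)0);  // tightly packed xyz
    glDrawArrays(GL_TRIANGLES, 0, terrainVertexCount);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```

A real terrain renderer would also store normals and colors (either in separate buffers or interleaved) and would typically use an index buffer (`GL_ELEMENT_ARRAY_BUFFER` with `glDrawElements`) so each of the 256x256 vertices is stored only once.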