
View Full Version : nvidia performance lost?



TheGreen
01-23-2009, 01:31 PM
Hi,

I have a little question about nvidia graphics cards and OpenGL 1.3.
Can it be that nvidia cards have a performance loss?
A friend of mine has an ATI X700 with 256 MB RAM (under Linux), I have a Geforce 8600 mobile with 512 MB RAM (under Linux), and on my home PC I have a Geforce 8800 GT with 512 MB RAM (under Windows Vista). The same program runs smoother on the older ATI X700 than on the other two graphics cards. I don't understand it; it is the same program.

Does nvidia have problems with OpenGL, or must I enable or disable some option?

I hope you understand my problem and can help me.

Thanks a lot.

TheGreen

ZbuffeR
01-23-2009, 02:24 PM
- do you have proper Nvidia drivers for your cards ?
- what program is it ?

TheGreen
01-24-2009, 01:53 PM
Hi,

I have the newest nvidia driver installed (181.22).

It's a self-written program, and there is not much for the graphics card to do: only a skybox (6 textures of 256x256 pixels, each ~100 KB) and 4-5 spheres with 32 slices and 32 stacks.

Do you have any idea?

The Green

_NK47
01-27-2009, 07:44 AM
have you tested any other OpenGL app/game on your PC?
do you use a lot of glFlush/glFinish calls?
it sounds weird that it works well on ATI but not on NVidia; most of the time it's the other way around.

TheGreen
01-27-2009, 11:45 AM
hi _nk47,

I just use one glFlush after swapBuffers at the end of my paint routine. I call the routine every 15 ms.

I have not tested any other game, but glxgears shows me a frame rate of at least five digits.

yes, I always thought *buh* bad ati, good nvidia... hmm.

thank you for your idea

the green

_NK47
01-28-2009, 07:24 AM
probably not related to the problem itself, but using glFlush after swapBuffers is redundant: graphics cards flush the command queue on buffer swap anyway. let us know what the problem was if you discover it. good luck.

erothrax
01-29-2009, 10:34 AM
Hey,


In the nvidia control panel, under 3D options, try changing it from multi display mode to single display mode:
http://img502.imageshack.us/img502/9828/01292009123441kf0.jpg

I heard this helps performance. Other than that, if you search Google for "8800GT opengl problem" you'll see they have some type of bug they didn't fix --I think-- but I have yet to have any problems with OpenGL, and I have an 8800GT.

TheGreen
02-04-2009, 01:34 AM
Thank you erothrax, but the problem is still there.
It looks like the graphics card isn't the problem.

I measured the time for one frame and found the problem.

I trigger glut's timer func with the parameter 10, so that the timer func should fire every 10 ms, and the timer func calls glutPostRedisplay.

But one frame takes 15.625 ms (vsync is off); I use a TFT display, so that is the max refresh rate.

I also measured the calculation time in the timer func and the render time in the display func. Both times are <= 1 ms, so I can say the calculation and render processes are fast enough.

15.625 ms corresponds to 64 frames per second, but at random intervals a frame takes exactly double that, 31.25 ms, or a frame needs 0 ms.

So I lose a frame at random intervals, but I don't know why. I use only one global timer func, and only this timer func calls one glutPostRedisplay.

Does anyone have an idea?

Thanks for answers

The Green
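
An editorial aside on the numbers above: 15.625 ms is exactly 1/64 s, the classic Windows system-timer tick, not a typical 60 Hz TFT refresh period (that would be ~16.7 ms). A plausible reading (an assumption, not confirmed in the thread) is that the 10 ms glutTimerFunc timeout is rounded up to the next system-timer tick, so the timer actually fires every 15.625 ms; scheduling jitter around those tick boundaries would then yield the occasional 31.25 ms and 0 ms outliers. A toy simulation of that rounding, with hypothetical helper names:

```python
import math

# Assumption: timer callbacks are quantized to the OS timer tick.
# 15.625 ms = 1/64 s, the historical Windows timer granularity.
TICK = 15.625  # ms

def next_fire(now_ms, requested_ms, tick=TICK):
    """Earliest tick boundary at or after now + requested timeout."""
    return math.ceil((now_ms + requested_ms) / tick) * tick

def simulate(requested_ms, n_frames):
    """Fire times of a timer that re-registers itself in its own callback,
    as a glutTimerFunc-driven render loop does."""
    t, times = 0.0, []
    for _ in range(n_frames):
        t = next_fire(t, requested_ms)
        times.append(t)
    return times

times = simulate(10, 8)  # ask for 10 ms, as in the thread
deltas = [b - a for a, b in zip(times, times[1:])]
print(deltas)  # every interval is 15.625 ms, i.e. 64 fps, not 100 fps
```

Under this model the steady-state frame interval is set by the timer granularity, not by the GPU, which would explain why the same program behaves differently on machines with different OS timer settings.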

_NK47
02-05-2009, 03:25 AM
is it correct that you use a timer event to draw your scene?
if so, why?

TheGreen
02-05-2009, 05:24 AM
Hi nk47,

yes, we are using a timer event (glutTimerFunc) to draw our scene.
The other option I know of is the idle func, but that function is not good.

How else can I draw our scene?

_NK47
02-05-2009, 08:32 AM
there are probably many ways to render; i'll describe how i do it.
you just render everything in an idle function: you have a sort of infinite loop which draws all objects, jumps back to the start, and draws them again without ever waiting for anything. the time delta you get between the current and the last frame can be used for all sorts of animation. if one frame gets rendered slower for some reason, all animated data is still updated correctly, because your time delta will be higher in that case, resulting in a larger animation advance. timer events are not precise enough, and honestly i have never seen real-time graphics drawn with timer events.
going a bit further, there are implementations with two threads, using one for calculating your data and the other solely for rendering anything that comes its way in idle mode.
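
The idle-loop advice above boils down to scaling every animation step by the frame's time delta. A minimal sketch (hypothetical names; the glutIdleFunc plumbing and the actual GL draw calls are omitted so the timing logic stands alone):

```python
# Delta-time animation as described above: state advances by elapsed
# wall time, not by frame count, so a slow frame takes a bigger step.
# "angle" stands in for any animated quantity (illustration only).

DEG_PER_SEC = 90.0  # animation speed: 90 degrees per second

def advance(angle, dt_seconds, speed=DEG_PER_SEC):
    """Advance the animation by the elapsed time since the last frame."""
    return angle + speed * dt_seconds

# Two quick frames cover the same animation distance as one slow frame
# of equal total duration:
a = advance(advance(0.0, 0.016), 0.016)  # two 16 ms frames
b = advance(0.0, 0.032)                  # one 32 ms hitch
print(a, b)  # both ~2.88 degrees
```

In a real GLUT program the idle callback would read a clock, compute dt from the previous timestamp, call something like advance for each animated object, and then issue glutPostRedisplay, with no timer involved.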
you just render everything in an idle function, that is you have an sortof infinite loop which draws all objects jumps to the start and draws them again without ever waiting for something. the time delta you get between current and last frame can be used for all sorts of animation. if one frame gets rendered slower for some reasons then all animated data is still updated in a correct way because your time delta in this case will be higher resulting in farther animation advance. all timer events are not precise enough and honestly i never saw something similar to draw real-time graphics with timer events. going a bit further there are implementations with two threads using one for calculation of your data and the other for solely rendering anything that comes in its way in idle mode.