I have a little question about NVIDIA graphics cards and OpenGL 1.3.
Can it be that NVIDIA cards suffer a performance loss?
A friend of mine has an ATI X700 with 256 MB RAM (under Linux), I have a GeForce 8600 Mobile with 512 MB RAM (under Linux), and on my home PC I have a GeForce 8800 GT with 512 MB RAM (under Windows Vista). The same program runs smoother on the older ATI X700 than on the other two graphics cards. I don't understand it; it is the same program.
Does NVIDIA have problems with OpenGL, or do I have to enable or disable some option?
I have the newest NVIDIA driver installed (181.22).
It's a self-written program, and there is not much for the graphics card to do: only a skybox (six 256x256 textures, each ~100 KB) and 4-5 spheres with 32 slices and 32 stacks.
Have you tested any other OpenGL app/game on your PC?
Do you use a lot of glFlush/glFinish calls?
It sounds weird that it works well on ATI but not on NVIDIA; most of the time it's the other way around.
Probably not related to the problem itself, but calling glFlush after swapBuffers is redundant; graphics cards flush the command queue on buffer swap anyway. Let us know what the problem was if you discover it. Good luck.
I heard this helps performance, but I'm not sure. Other than that, if you search Google for "8800GT opengl problem" you'll see they have some kind of bug they didn't fix, I think. But I have yet to have any problems with OpenGL, and I have an 8800GT.
Thank you erothax, but the problem persists.
It looks like the graphics card isn't the problem.
I measured the time for one frame and I found the problem.
I register GLUT's timer func with the parameter 10, so that the timer func should fire every 10 ms, and the timer func calls glutPostRedisplay.
But one frame takes 15.625 ms (vsync is off), because I use a TFT display, so that is the maximum refresh rate.
I also measured the calculation time in the timer func and the render time in the display func. Both times are <= 1 ms, so I can say the calculation and render processes are fast enough.
15.625 ms corresponds to ~64 frames per second, but at random intervals a frame takes exactly double that (31.25 ms), or a frame needs 0 ms.
So at random intervals I lose a frame, but I don't know why. I use only one global timer func, and only this timer func calls glutPostRedisplay.
There may be many ways to render; I'll describe how I do it.
You just render everything in an idle function; that is, you have a sort of infinite loop which draws all objects, jumps back to the start, and draws them again without ever waiting for anything. The time delta you get between the current and the last frame can be used for all sorts of animation. If one frame renders slower for some reason, all animated data is still updated correctly, because your time delta in that case will be larger, resulting in a farther animation advance.

Timer events are not precise enough, and honestly I have never seen anything similar draw real-time graphics with timer events. Going a bit further, there are implementations with two threads: one solely for calculating your data, and the other for rendering anything that comes its way in idle mode.