Application is stuttering on every system...



fgreen
01-29-2010, 05:41 AM
Hi there,

I have developed an application to be used as part of a research project. The application shows some spheres rolling around within a room. For this research project it is really important that all objects move very smoothly.

My problem is that the objects within the scene all stutter every few seconds and I have no clue why. As we were not able to find any issues in the code, I wrote a simple application that just rotates a single triangle on the screen. I tried this on a GLUT and an SDL rendering canvas and could see the same behavior in both. The triangle seems to get a little slower or faster for a fraction of a second and then continues normally. I tried the application on several systems, including Windows and Linux on different graphics cards (all nVidia devices), and saw the same behavior on every system. I even tried it on an NVIDIA Quadro FX 5800 graphics card, where it ran (vsync disabled) at about 10,000 fps. I have tried enabling and disabling vsync, but it did not change anything.

We excluded CPU core switches as the source of the problem by pinning the process to a single core. The system we used for testing does not have SpeedStep or anything similar enabled. As we did not trust the real-time clock on the system, we even read the CPU cycle count between two frames directly from the CPU, which did not change anything either.

As mentioned before, vsync was enabled on the system. To exclude it as a cause we also disabled vsync, which did not change anything. We sampled the number of CPU cycles between two frames but could not detect any major variations; all variations were smaller than 4 ms on all systems.

So I have no idea where to look for a solution anymore. Does anyone here have any idea why I get this stuttering movement?

Jan
01-29-2010, 06:38 AM
The amount of movement of your objects IS proportional to the time that has passed, no? Your description sounds as if you were doing fixed movement steps each frame, which leads to varying movement when the frame rate isn't stable.

Maybe a video could help to show the exact behavior.

Jan.

fgreen
01-29-2010, 08:39 AM
The amount of movement of your objects IS proportional to the time that has passed, no? Your description sounds as if you were doing fixed movement steps each frame, which leads to varying movement when the frame rate isn't stable.

Maybe a video could help to show the exact behavior.

Jan.

No, of course I do not use fixed movement steps per frame; I use the system time in milliseconds as the basis for the object movements. The other thing I tried is reading the number of CPU cycles between two rendering passes and dividing it by the number of CPU cycles per second the machine performs.

So, in my eyes I use a very exact method for timing the object movements. I use the time in kinematic equations, which means that I calculate the objects' positions absolutely from their origin at scene start.
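
To make that concrete, the update is along these lines (a minimal sketch, not the actual project code; the function and constants are only illustrative):

#include <math.h>

/* Sketch only: positions are recomputed absolutely from the elapsed time
   every frame, never accumulated with per-frame steps. */
void animate_absolute(double elapsed_seconds, double *angle, double *x)
{
    *angle = fmod(elapsed_seconds / 7.0, 1.0) * 360.0;  /* one full turn every 7 s */
    *x     = sin(elapsed_seconds / 5.0) * 3.0;          /* oscillating translation */
    /* as opposed to: *angle += FIXED_STEP;  (per-frame step, frame-rate dependent) */
}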

I have uploaded a copy of my current testing code to http://download.gadgetweb.de/SdlTest/SdlTest.tar, or you can just take a look at the .c file at http://download.gadgetweb.de/SdlTest/main/main.c. I have played around with the code a little bit, so it might currently contain some flaws, but the stuttering effects are still there.

Edit: A video of this problem would be useless, as the effect appears within a fraction of a second, which means that encoding the captured footage down to a downloadable size would destroy it.

ZbuffeR
01-29-2010, 08:49 AM
Try sticking a glFinish() right after SDL_GL_SwapBuffers();
It lowers performance and eats 100% CPU for almost nothing... but it reduced stuttering every time I used it.
glFlush() is less extreme, but also somewhat less useful.
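
Roughly like this in the SDL 1.2 test program (just a sketch of where the call goes; the header path may differ on your setup):

#include <SDL/SDL.h>
#include <GL/gl.h>

/* end-of-frame sequence with an explicit sync point */
static void end_frame(void)
{
    SDL_GL_SwapBuffers();  /* SDL 1.2 buffer swap */
    glFinish();            /* block until the GPU has actually finished the frame */
}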

fgreen
01-29-2010, 10:01 AM
Try sticking a glFinish() right after SDL_GL_SwapBuffers();
It lowers performance and eats 100% CPU for almost nothing... but it reduced stuttering every time I used it.
glFlush() is less extreme, but also somewhat less useful.

Thank you for your advice. As I am rendering only a very low number of vertices and have vsync enabled, this is not the issue. I added a glFinish(), but the problem still exists.

To me it seems more likely to be a problem with the driver or the graphics hardware. A colleague of mine tested the application using software rendering and it no longer seems to show the stuttering.

I have not tried to run the application on ATI hardware yet, so I will try to find a way to get access to a system with an ATI card instead of an nVidia one.

Nosedog
01-29-2010, 05:52 PM
Your test program, as it is, works perfectly fine on my system, without stuttering. Are you using the latest SDL, version 1.2.14? I know they fixed a lot of speed problems on Windows Vista/7.

fgreen
01-30-2010, 02:51 AM
Your test program, as it is, works perfectly fine on my system, without stuttering. Are you using the latest SDL, version 1.2.14? I know they fixed a lot of speed problems on Windows Vista/7.

Oh, I should have mentioned that I tested it on Windows XP and Windows 7. I am also using the latest SDL version, but it definitely is not an SDL issue, as it occurs under GLUT and plain GLX as well. Are you really sure that it is running completely smoothly? The stuttering is really hard to see. You can see it by looking at the edges of the cube.

Which graphics card do you use?

The actual application has been under development for about four months now and I had not noticed the stuttering during that period. I tested the application using a stereo projection and head tracking during that time. The head tracking increases the motion complexity, which makes it nearly impossible to see the stuttering. But I noticed it when I used a normal two-dimensional projection, as the range of the stuttering was much bigger than on a normal monitor.

We even tried the GLX-based version of the first tutorial examples at http://nehe.gamedev.net/data/lessons/lesson.asp?lesson=04. After changing the axis of rotation to z and turning the speed down, we were able to see the same stuttering. I even see this effect in games with simple trajectories and no camera movement, like SuperTux. I cannot tell whether it also occurs in complex commercial 3D games, as their complex camera movements make it impossible to see the effect.

All I really know about this problem is that it is not an issue of SDL, GLUT or any other library, that it should not be a CPU or system timing issue, and that it also exists on high-performance graphics systems.

Edit: Sorry, I use the latest SDL.

CatDog
01-30-2010, 05:13 AM
Two days ago, it took me half an hour to realize that my suddenly badly stuttering application was being thwarted by Google Chrome diligently displaying the Burger King Flash web page in the background, which had been eagerly visited before and wasn't closed.

Burgers are a scourge of humanity.

CatDog

fgreen
01-30-2010, 08:05 AM
Two days ago, it took me half an hour to realize that my suddenly badly stuttering application was being thwarted by Google Chrome diligently displaying the Burger King Flash web page in the background, which had been eagerly visited before and wasn't closed.

Burgers are a scourge of humanity.

CatDog

Hehe, okay...
I wish fixing my problem were as easy as closing a browser window ;)

This stuff is driving me crazy. Well, I can see the same stuttering in Nexuiz if I do not touch the mouse and just walk straight forward. Does anyone know if there are any known issues with the nVidia implementation of OpenGL? Maybe I should test the same thing using Direct3D to see if I get the same problems there.

mark ds
01-30-2010, 11:56 AM
I have no experience with SDL whatsoever, but I wonder if the culprit is the resolution of SDL_GetTicks? If it uses GetTickCount internally, maybe try timeBeginPeriod(1) and then timeEndPeriod(1).

Better still, try QueryPerformanceCounter in windows.
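
Something along these lines (an untested sketch; timeBeginPeriod/timeEndPeriod live in winmm, and the helper name here is only illustrative):

#include <windows.h>

/* Returns the seconds elapsed since the previous call, using the
   high-resolution performance counter. */
static double frame_delta_seconds(void)
{
    static LARGE_INTEGER freq, last;
    LARGE_INTEGER now;
    double dt;

    if (freq.QuadPart == 0) {              /* first call: query the frequency once */
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&last);
    }
    QueryPerformanceCounter(&now);
    dt = (double)(now.QuadPart - last.QuadPart) / (double)freq.QuadPart;
    last = now;
    return dt;
}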


Also, have you verified in taskmanager that no background processes are chewing up cpu cycles?

fgreen
01-30-2010, 01:58 PM
I have no experience with SDL whatsoever, but I wonder if the culprit is the resolution of SDL_GetTicks? If it uses GetTickCount internally, maybe try timeBeginPeriod(1) and then timeEndPeriod(1).

Better still, try QueryPerformanceCounter in windows.


This was my first idea. I have tried most things to fix this issue on Linux, as I have more ways there to see what the system is actually doing.

The timers in Windows have a resolution of 16 ms, which means that you should not see any stuttering. Anyway: as I said, I totally mistrust the system timers, so under Linux I used some inline assembly to count the CPU cycles between two calls of my rendering method. As the system does not have SpeedStep or anything else changing the CPU frequency at runtime, this should be the best way to get exact timing information.
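
The cycle counting is essentially this (a simplified sketch of the idea, not our exact code; x86/x86-64 with GCC inline assembly):

#include <stdint.h>

/* read the CPU's time stamp counter */
static inline uint64_t read_tsc(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

/* elapsed seconds between two frames, given a fixed CPU frequency in Hz:
   double dt = (double)(tsc_now - tsc_last) / cpu_hz; */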



Also, have you verified in taskmanager that no background processes are chewing up cpu cycles?

Yes, I have done this. There are no other processes running that consume so many CPU cycles that it could lead to stuttering. And the Linux system I used during my last tests is a dual quad-core Xeon 3.2 GHz system; I reserved a single core just for my rendering application and used that specific core to read the number of CPU cycles for my timing calculation. It just can't be an issue with the time values I measured. The method I use to read the cycles has been used in another research project that needs really exact time values, in the range of about 100 ns, for force feedback calculations, so it has been tested and used extensively before. Btw.: force feedback applications need an update rate of about 1 kHz, as the touch receptors in our body react at that frequency, which is much faster than the roughly 25 Hz our eyes do, and they do not get any stuttering effects with time values measured this way.
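
The pinning itself is nothing special; on Linux it boils down to something like this (illustrative sketch, not our exact setup code):

#define _GNU_SOURCE
#include <sched.h>

/* restrict the calling process to a single CPU core */
static int pin_to_core(int core)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    return sched_setaffinity(0, sizeof(set), &set);  /* pid 0 = this process */
}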

zed
01-31-2010, 04:44 PM
Some nVidia cards have an issue with severe stuttering if you have nView enabled.

This problem occurs with both OpenGL and D3D programs.

If you have it enabled, disable it and rerun your program.

yooyo
01-31-2010, 05:38 PM
In some cases the OS kernel can change the CPU core (on multicore CPUs) that executes your app. A couple of years ago there was an issue with this, because the cores in multicore CPUs did not have synchronized clocks; even negative delta times could occur.

There is a fix for XP, and the issue is resolved in newer motherboard BIOSes.

fgreen
02-01-2010, 01:31 AM
Some nVidia cards have an issue with severe stuttering if you have nView enabled.

This problem occurs with both OpenGL and D3D programs.

If you have it enabled, disable it and rerun your program.

No, I do not use nView on any system.


In some cases the OS kernel can change the CPU core (on multicore CPUs) that executes your app. A couple of years ago there was an issue with this, because the cores in multicore CPUs did not have synchronized clocks; even negative delta times could occur.

There is a fix for XP, and the issue is resolved in newer motherboard BIOSes.

To exclude any timing differences or delays from CPU core switches, we forced the application to run on one specific core only. And the problem is not related to Windows XP; it exists under Linux as well.

marshats
02-02-2010, 01:05 PM
I modified your code to use GLUT instead of SDL, though it still uses GLUT's low-resolution timer. I do not see any stuttering with this. Maybe you could recompile it on your machine as a test of your GL drivers:



gcc main.c -lGL -lGLU -lglut -lGLEW -lm

Does it still stutter? The reason I tried this is to separate your GL driver from your windowing toolkit.

BTW, what does glewinfo report back to you about the driver being used?



/*
* main.c
*
* This application may be used and modified in the terms
* of the GPL v2 license.
*
* Created on: 26.01.2010
* Author: Falk Garbsch
*/

#include <GL/glew.h>
#include <GL/glut.h>
#include <GL/glu.h>
#include <GL/gl.h>
#include <stdio.h>
#include <math.h>

#include <sys/time.h>
#include <stdint.h>
#include <unistd.h>

double angle;
double translate;
long ltime;
double ntime;
long count;

//FILE* fout;
//long tfcnt;

void prepare() {
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-0.5, 0.5, -0.375, 0.375, 1.0, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0, 0, -5, 0, 0, 0, 0, 1, 0);
    glEnable(GL_DEPTH_TEST);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_ACCUM_BUFFER_BIT);
}

void drawcube() {
    glBegin(GL_TRIANGLES);
    glColor3f(1.0, 0.0, 0.0);
    glVertex3f(-0.5, -0.5, -0.5);
    glVertex3f(0.5, -0.5, -0.5);
    glVertex3f(-0.5, 0.5, -0.5);

    glVertex3f(0.5, -0.5, -0.5);
    glVertex3f(-0.5, 0.5, -0.5);
    glVertex3f(0.5, 0.5, -0.5);

    glColor3f(0.0, 1.0, 0.0);
    glVertex3f(-0.5, -0.5, -0.5);
    glVertex3f(0.5, -0.5, -0.5);
    glVertex3f(-0.5, -0.5, 0.5);

    glVertex3f(0.5, -0.5, -0.5);
    glVertex3f(-0.5, -0.5, 0.5);
    glVertex3f(0.5, -0.5, 0.5);

    glColor3f(0.0, 0.0, 1.0);
    glVertex3f(-0.5, -0.5, -0.5);
    glVertex3f(-0.5, 0.5, -0.5);
    glVertex3f(-0.5, -0.5, 0.5);

    glVertex3f(-0.5, 0.5, -0.5);
    glVertex3f(-0.5, -0.5, 0.5);
    glVertex3f(-0.5, 0.5, 0.5);

    glColor3f(0.0, 1.0, 1.0);
    glVertex3f(0.5, -0.5, -0.5);
    glVertex3f(0.5, 0.5, -0.5);
    glVertex3f(0.5, -0.5, 0.5);

    glVertex3f(0.5, 0.5, -0.5);
    glVertex3f(0.5, -0.5, 0.5);
    glVertex3f(0.5, 0.5, 0.5);

    glColor3f(1.0, 0.0, 1.0);
    glVertex3f(-0.5, -0.5, 0.5);
    glVertex3f(0.5, -0.5, 0.5);
    glVertex3f(-0.5, 0.5, 0.5);

    glVertex3f(0.5, -0.5, 0.5);
    glVertex3f(-0.5, 0.5, 0.5);
    glVertex3f(0.5, 0.5, 0.5);

    glColor3f(1.0, 1.0, 0.0);
    glVertex3f(-0.5, 0.5, -0.5);
    glVertex3f(0.5, 0.5, -0.5);
    glVertex3f(-0.5, 0.5, 0.5);

    glVertex3f(0.5, 0.5, -0.5);
    glVertex3f(-0.5, 0.5, 0.5);
    glVertex3f(0.5, 0.5, 0.5);
    glEnd();
}

void render() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glPushMatrix();
    glRotatef(angle, 0, 0, 1);
    drawcube();
    glPopMatrix();

    glutSwapBuffers();
}

void animate() {
    /* double tstep = stop_timing();
    ntime += tstep;
    start_timing();*/
    double ntime = glutGet(GLUT_ELAPSED_TIME);
    double fps;
    //tfcnt++;
    //fprintf(fout, "%d %f\n", tfcnt, tstep);

    if (ntime - ltime > 5000) {
        fps = (double)count * 1000.0 / (double)(ntime - ltime);
        printf("fps: %f\n", fps);
        ltime = ntime;
        count = 0;
    }
    count++;

    angle = (double)ntime / 7000.0 * 360.0;
    translate = sin((ntime / 5000.0)) * 3.0;

    glutPostRedisplay();
}

int main(int argc, char** argv) {
    count = 0;
    angle = 0;
    translate = 0;
    ntime = 0;
    //tfcnt = 0;

    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE|GLUT_DEPTH);
    glutCreateWindow("Multipass texturing Demo");
    glewInit();

    glutIdleFunc(animate);
    glutDisplayFunc(render);
    //glutKeyboardFunc(keyboard);

    prepare();

    glutMainLoop();
    return 0;
}

marshats
02-03-2010, 08:04 AM
Also, I compiled and ran your SDL code on my Ubuntu 9.10 32-bit ASUS 9600GT with the NVIDIA 195.17 binary driver installed. Just like the GLUT code in my previous post, it ran fine without any stuttering effects at all.

What is your hardware setup and which GL driver do you have installed in Linux? MESA? NVIDIA 19?.???

Chris Lux
02-03-2010, 08:33 AM
maybe some desktop compositing enabled (compiz etc.)?

fgreen
02-09-2010, 09:36 AM
Sorry, I have been busy for the last days...

First I would like to thank you for your replies.

It should not be a compositing issue, as Windows does not have any composite extension and I have also tested it on a clean Xorg server without any window manager running.

We used lesson 4 from http://nehe.gamedev.net/ and added a high-performance counter for frame timing. The stuttering was the same with this demo application on Windows XP, Windows Vista and Windows 7. A colleague of mine told me that it was not stuttering on Windows 7 on the nVidia Quadro FX 5800 system, but when I checked yesterday I could clearly see the same issue. The stuttering does not happen periodically, and it is very hard to see during slow or fast motion. Most of the people I asked told me at first that the application was not stuttering and needed some time to recognize the effect. It is much easier to see on a projection about 2 or 3 meters wide than on a small display.

Maybe calling it micro stutter (I found the term somewhere on the web) would be better. It is far easier to see on linear movements. As we could see the issue on every system we used, we have contacted nVidia to hear what they say about it. I will post a link to the modified NeHe demo (Windows version only) as soon as possible. We changed the rotation to a linear translation, as we found that it is very hard to see the stuttering on rotating objects.


Also, I compiled and ran your SDL code on my Ubuntu 9.10 32-bit ASUS 9600GT with the NVIDIA 195.17 binary driver installed. Just like the GLUT code in my previous post, it ran fine without any stuttering effects at all.

What is your hardware setup and which GL driver do you have installed in Linux? MESA? NVIDIA 19?.???

As I said, we have tried lots of systems. Our high-performance rendering system contains the following hardware:
2x Xeon quad-core (4x 3.2 GHz)
24 GB RAM (triple channel)
nVidia Quadro FX 5800 (4 GB RAM + 3 GB shared)

It runs the latest Fedora or Windows 7. I will look up the driver version in the next few days, but it should be one of the 18?.?? versions. We do of course not use any MESA rendering.


I modified your code to use GLUT instead of SDL, though it still uses GLUT's low-resolution timer. I do not see any stuttering with this. Maybe you could recompile it on your machine as a test of your GL drivers:


We used GLUT before I changed it to SDL. As I said before, the timer is not the problem. The start_timing()/stop_timing() methods that are commented out in the source use the CPU cycle counter for timing, which is essentially the same thing QueryPerformanceCounter does on Windows.

I will see if I can post the full system and driver configuration of our systems, but I do not think this is a software or system configuration issue anymore.

marshats
02-10-2010, 07:52 AM
SDL or GLUT makes no difference to me. Since I saw no issue with the SDL code, I thought I would explore a new data point with GLUT :)

I tried staring at your SDL code running again for a minute or two -- got dizzy -- and still see no "micro stuttering". I am only seeing an FPS of "fps: 59.952038". What FPS number are you getting when you see micro stuttering? Is it possible that I am simply synced to my monitor refresh rate whereas you are not in your particular test setups?

ps just curious, if you change in animate()


angle = (double)ntime / 7000.0 * 360.0;
to
angle = (double)((Uint32)ntime % 7000) * 360. / 7000.0;

does this help?

ZbuffeR
02-10-2010, 09:51 AM
No visible stuttering at all with the code posted above. And I find myself quite sensitive to this sort of thing.

Do you have updated code which would better demonstrate the problem?

fgreen
02-11-2010, 06:14 AM
I tried staring at your SDL code running again for a minute or two -- got dizzy -- and still see no "micro stuttering". I am only seeing an FPS of "fps: 59.952038". What FPS number are you getting when you see micro stuttering? Is it possible that I am simply synced to my monitor refresh rate whereas you are not in your particular test setups?


I see about the same rate as you do. It is perfectly synced to the vertical refresh, and it is not a flickering problem or anything like that.



ps just curious, if you change in animate()


angle = (double)ntime / 7000.0 * 360.0;
to
angle = (double)((Uint32)ntime % 7000) * 360. / 7000.0;

does this help?

I have done this before and it does not change anything. I have even used some linear movements, which show the same effect.


No visible stuttering at all with the code posted above. And I find myself quite sensitive to this sort of thing.

Do you have updated code which would better demonstrate the problem?

I have some code and will upload it later. The problem is that I only have a Windows version of the code, so I might have to create a Linux version as well.

macarter
02-11-2010, 10:11 AM
Reading the clock in animate is conceptually flawed. The angular position should be based on the time the image is displayed, not on the time it is computed. See if vsync with a fixed time delta removes the "micro stuttering".

The low resolution of glutGet(GLUT_ELAPSED_TIME) could also contribute to the problem. Despite your claim, sixteen milliseconds is nowhere near enough resolution.

macarter
02-11-2010, 11:21 AM
I have seen flat-panel displays and projectors that cause stuttering. They run internally at a fixed rate that differs from the video input signal, and they drop or interpolate frames. I have never seen a CRT display with this problem.

fgreen
02-14-2010, 05:45 AM
Reading the clock in animate is conceptually flawed.

I don't think so, as it is called directly before rendering. I have also tried calling the animate method from within the rendering method. This would only be a flaw if the animate method were called asynchronously, which not even GLUT does.


The low resolution of glutGet(GLUT_ELAPSED_TIME) could also contribute to the problem. Despite your claim, sixteen milliseconds is nowhere near enough resolution.

As I said, I even used the number of CPU cycles between two rendering passes to get the exact elapsed time. On a 3.2 GHz CPU you easily get microsecond (or even nanosecond) resolution. Neither the timing nor the animation method should be the source of the stuttering.


I have seen flat-panel displays and projectors that cause stuttering. They run internally at a fixed rate that differs from the video input signal, and they drop or interpolate frames. I have never seen a CRT display with this problem.

It could be worth trying it on a CRT with a VGA input; such devices should not even be able to interpolate images. But it would be a bit of a horror scenario if our Infitec projectors interpolated frames ;)

I have uploaded a Windows demo at http://download.gadgetweb.de/lesson4/lesson04.zip . It is a modified version of lesson 4 from http://nehe.gamedev.net that just moves a quad from right to left using the Windows QueryPerformanceCounter method. Because it uses this method, the application will stutter on SpeedStep systems, which is not the stuttering I want to get rid of. So if you want to test the application, make sure to disable all SpeedStep features first.

Stephen A
02-14-2010, 09:38 AM
OK, I modified this test to enable vsync via wglSwapIntervalEXT and ran it on my Win7 / 85 Hz CRT / ATI 4850 / SpeedStep-enabled machine. Result: the only stuttering I can see is caused by the quad moving on non-integer pixel coordinates. Otherwise, the result is buttery smooth.
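
For reference, the vsync toggle is roughly this (a sketch; a real program should first check for WGL_EXT_swap_control in the extensions string, and the call must be made with a GL context current):

#include <windows.h>
#include <GL/gl.h>

typedef BOOL (APIENTRY *PFNWGLSWAPINTERVALEXTPROC)(int interval);

/* enable vsync on the current WGL context */
static void enable_vsync(void)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT)
        wglSwapIntervalEXT(1);   /* 1 = wait for one vertical retrace per swap */
}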

macarter
02-15-2010, 04:50 PM
I have seen fan speed controls and Intel Ethernet drivers steal milliseconds of CPU time on a periodic basis. A few milliseconds of missing time may not cause a dropped frame in your GLUT application, but it may disturb the timing of your animation. Since the video sync timing is fixed, your animation delta time should also be fixed. Try logging the animation delta time to a file. Timing variations of a couple of milliseconds are detectable by eye; dropped frames even more so.
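
A minimal way to do that (a sketch; the file name and format are arbitrary, and the output can be plotted with gnuplot):

#include <stdio.h>

static FILE *dt_log = NULL;

/* append one line per frame: frame index and delta time in milliseconds */
static void log_delta(long frame, double dt_ms)
{
    if (!dt_log)
        dt_log = fopen("frame_times.txt", "w");
    if (dt_log)
        fprintf(dt_log, "%ld %.3f\n", frame, dt_ms);
}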

def
02-17-2010, 04:02 AM
Just to prove your sanity: I could reproduce the stuttering with your Windows demo.
The effect occurred every few seconds (6 - 10 secs).
This was running on both an Intel quad-core desktop system and a 2x dual-core Opteron server system using a single GTX 285.

Both systems are used extensively for broadcast video graphics, SD and HD, and there is definitely no stuttering in our applications.

So it cannot be a hardware/driver/system problem, I think.

Most of our tools use QueryPerformanceCounter for timing, but I need to double-check.

Why don't you output the timing deltas you calculate, as well as the translation deltas per frame? Being synced to vblank, the "stutters" should be obvious in the numbers as well.

fgreen
02-21-2010, 03:51 AM
Just to prove your sanity: I could reproduce the stuttering with your Windows demo.
The effect occurred every few seconds (6 - 10 secs).
This was running on both an Intel quad-core desktop system and a 2x dual-core Opteron server system using a single GTX 285.

[...]

Why don't you output the timing deltas you calculate, as well as the translation deltas per frame? Being synced to vblank, the "stutters" should be obvious in the numbers as well.

We have already done this. We wrote all time differences between two frames to a file and checked for unusual values. We even plotted the values using gnuplot. The maximum deviation, with or without vsync, was smaller than 30 ns.

I did some sanity checks on my own code and had it reviewed by some colleagues. The Windows application I uploaded was not even written by me; I just modified it to use QueryPerformanceCounter. And it also stutters if I use the basic application without any QueryPerformanceCounter or timer query calls, with a fixed rotation or movement using the vblank sync as the timing basis.

fgreen
02-21-2010, 03:53 AM
OK, I modified this test to enable vsync via wglSwapIntervalEXT and ran it on my Win7 / 85 Hz CRT / ATI 4850 / SpeedStep-enabled machine. Result: the only stuttering I can see is caused by the quad moving on non-integer pixel coordinates. Otherwise, the result is buttery smooth.

Uhm... so I really have to find a system with ATI hardware to check it. If I am able to get it running smoothly on ATI hardware, using a different graphics card would be a first workaround.

MaxH
02-22-2010, 01:18 PM
FWIW: last night I wrote a simple GL program in connection with another thread on the forum and, lo and behold, got stuttering. I had never seen it before. In my app a circular region is supposed to move at constant speed, bouncing when it hits the edge of the window. It's a very simple 2D application. The motion is not smooth: about every second, it halts and jumps a little. I'm on an ATI Radeon X300 card and Windows XP. For me the stuttering goes away if I disable hardware acceleration, or if I enable vertical sync. I know you said that enabling vertical sync didn't help. Are you sure that it remains enabled when you run your application? Tomorrow, when I get back to work, I'll try my app on an NVidia card. It will be interesting to see if I still get the stuttering.

fgreen
03-09-2010, 03:59 AM
I'm on an ATI Radeon X300 card and Windows XP. For me the stuttering goes away if I disable hardware acceleration, or if I enable vertical sync.
[...]
Are you sure that it remains enabled when you run your application?


Absolutely sure, as enabling vsync drops the frame rate from over 10,000 fps down to about 60 fps. The SDL version of my application also reports whether it was able to enable vsync, and setting vsync returns a positive response.

peterfilm
03-10-2010, 06:28 AM
I can report serious stuttering on the Quadro line of cards at the moment. It is most evident under extremely heavy API load; in other words, when lots of uniforms are being updated for many batches and the batches are in display lists. When I say serious, I mean 5-second stalls when the load suddenly increases (i.e. when you turn the camera to face the model and there's a sudden jump in API calls in a frame). The same thing happens when the load is reduced suddenly (i.e. turning the camera away from a heavy model).
After the stall it runs smoothly until the load changes suddenly again. It happens constantly, not just when the model is first viewed. It has to do with changes in workload (presumably CPU workload in the driver thread). What is more telling is that during the stalls there is a constant stream of hundreds of page faults shown in Task Manager for the application rendering the model. When the stalling stops, the page faults stop.
5-second stalls are not acceptable. This occurs on Quadro cards from the 3500 up to the 5800, on both XP (32- and 64-bit) and Vista/W7, on dual- and quad-core CPUs, on 2 GB and 8 GB memory systems. It happens everywhere. It also happens in other third-party applications rendering the same kind of scenes.

Dark Photon
03-10-2010, 08:43 AM
...5 second stalls when the load suddenly increases (i.e. when you turn the camera to face the model...
Check the amount of textures, VBOs, etc. you're using. If you're blowing past the amount that comfortably fits in GPU memory, then such stalls are expected, particularly when you turn the camera, though 5 seconds sounds a bit long. Also, you do know you have to prerender to force the textures etc. onto the card, right?

This may be the dynamic shader recompilation that happens when uniforms change, but that was mostly a pre-GeForce 8 annoyance IIRC.

BTW, shouldn't this have been a new thread?

peterfilm
03-10-2010, 09:13 AM
No textures, no VBOs, and memory is not the issue (one of the cards has 4 GB onboard, but it doesn't matter anyway, as the biggest scene I've tested is only 250 MB in size).
The bit about pre-rendering: yes, I know about that, but you apparently missed the part of my post where I said it was consistently stalling for 5 seconds throughout the run, not just the first time.
The only shader uniform being changed is the current modelview matrix (via glLoadMatrix). This is using the fixed-function pipeline and a shader that uses the built-in state (alternating between the two code paths at build time, not run time). I've basically traced the problem to glLoadMatrix, but seeing as that's just a uniform under the hood, I imagine it happens for any 16-float uniform changed at that frequency... although I haven't checked.

You're absolutely right - this perhaps deserves a new thread.

peterfilm
03-10-2010, 09:20 AM
Dark Photon, I've just moved the discussion to a new thread titled "Quadro page faulting for 5 seconds all the time":
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=273623#Post273623

dorbie
03-12-2010, 03:41 PM
Reading the clock in animate is conceptually flawed.

I don't think so, as it is called directly before rendering. I have also tried calling the animate method from within the rendering method. This would only be a flaw if the animate method were called asynchronously, which not even GLUT does.


No, he's right, it is flawed, but it's usually not a major issue: the time used is based on the past frame, and the time in the future cannot be known. You can low-pass filter the timer, or you can run the simulation asynchronously and then render the latest frame at any given time, but you cannot see into the future and predict the time this frame will take to display and where things should be by then.

This is probably not your issue though.

fgreen
03-13-2010, 05:32 PM
[...]
No, he's right, it is flawed, but it's usually not a major issue: the time used is based on the past frame, and the time in the future cannot be known.
[...]


Okay, I see. I think this flaw could lead to very odd behaviour when waiting for vsync is enabled. But, as you already said, I think this flaw is not my problem, as the stuttering also occurs with vsync disabled.

But it might be worth measuring the time difference between the animation time and the completion of the flushing method. If there are any bigger differences, they might cause stuttering. I will check whether this is the case.

knackered
03-14-2010, 05:07 PM
that's a scary picture, dorbie!

MalcolmB
05-20-2010, 01:36 PM
Hey fgreen, did you get any further with this problem?

fgreen
10-12-2010, 03:50 AM
Hey fgreen, did you get any further with this problem?

No. I have not been able to fix this issue and I do not think that there is a way to cope with this problem from within my application.