PDA

View Full Version : Strange Problem With Rendering Speed...



ndj55
07-23-2002, 10:58 AM
Hi all!

I have a strange problem...
I have 2 computers: the first one is an Athlon 700 MHz with a TNT2 card and the second one is a 1000 MHz with a GeForce 2 GTS... I am working on a program with about 16000 triangles to draw. There are about 2000 triangles per display list...
The problem is that the frame rate is higher on the low-speed computer (700 + TNT2): it's near 76 FPS. The frame rate on the high-speed computer is about 25 FPS.
My two PCs have the latest drivers, and the high-speed computer works very fast with other OpenGL programs/games.
I tried several things such as removing the lighting, the texturing, the back-face culling and the COLOR_MATERIAL, but it changed nothing...
Just by removing the glNormal3f calls, the frame rate went up to near 40 FPS.
I am sure that the problem comes from something in the OpenGL part of the program, because I made a test program without DirectInput, DirectSound and other things like that...

So do you have an idea?
Do some functions work faster with some Graphics Card?

Does someone want to see my test program?
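When comparing numbers like "near 76 FPS" across machines, a smoothed frame-time counter is less noisy than dividing one frame by its duration. A minimal C sketch (the helper names are illustrative, not from ndj55's program; the timer source is whatever the platform provides):

```c
/* Exponentially smoothed frame time -> frames per second. */
typedef struct {
    double smoothed_dt;   /* running average of seconds per frame */
} FpsCounter;             /* initialize smoothed_dt to 0.0 */

static double fps_update(FpsCounter *c, double frame_dt)
{
    const double alpha = 0.1;            /* weight of the newest sample */
    if (c->smoothed_dt <= 0.0)
        c->smoothed_dt = frame_dt;       /* first frame: take it as-is */
    else
        c->smoothed_dt = alpha * frame_dt
                       + (1.0 - alpha) * c->smoothed_dt;
    return 1.0 / c->smoothed_dt;         /* FPS estimate */
}
```

Feed it the elapsed seconds of each frame (from QueryPerformanceCounter on Win32, for instance) and display the returned value.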

BlackJack
07-23-2002, 11:33 AM
Hm... if you would paste some code, that might help. What I'm guessing is that something like anisotropic filtering or antialiasing is enabled by default in your drivers... you should check the config... that might make a big difference. It may also be that something is configured wrong with your AGP port or in your BIOS... very hard to say. Have you ever tried putting the GeForce into the PC with the TNT2 and checking the framerate then? As far as I know, display lists can be stored in AGP or video memory. On my old computer my AGP memory was far slower than system memory (some weird drivers and so on), so if the driver thinks it is doing something good by storing them in AGP, that might cause this awful FPS as well.

BlackJack

ToolChest
07-23-2002, 02:36 PM
What is your GL initialization code? Do you enable and disable almost every state like fog and lighting (if not used), or just the ones you use? For some reason OpenGL doesn't seem to have a standard for the way states are set when you create your context, or maybe card vendors just ignore it, I don't know. A good rule of thumb is: if you didn't set it, DO NOT assume it's off. Even if it works on every card you could test on, someone will be running some ancient card that will throw you a curve ball.

hope this helped...

John.

SirKnight
07-23-2002, 03:44 PM
Is FSAA turned on on the faster computer? That could explain a low framerate on there.

-SirKnight

ndj55
07-24-2002, 09:02 AM
Yes, FSAA is off on the fastest computer...
Perhaps if I use glDrawElements instead of display lists, it will solve the problem... What is your opinion?
I am going to try that...

Here is my OpenGL Initialization code:
(Comments have been removed....)
--------------------------------------------
GetClientRect(hWnd, &wndRect);
ghDC = GetDC(hWnd);

if (!SetupGLPixelFormat(ghDC))
{
    PostQuitMessage(0);
}

ghRC = wglCreateContext(ghDC);
wglMakeCurrent(ghDC, ghRC);
SizeGLScreen();

GLfloat LightAmbient[] = { 0.5f, 0.5f, 0.5f, 1.0f};
GLfloat LightDiffuse[] = { 0.8f, 0.8f, 0.8f, 1.0f};
glLightfv(GL_LIGHT0, GL_AMBIENT, LightAmbient);
glLightfv(GL_LIGHT0, GL_DIFFUSE, LightDiffuse);
glEnable(GL_LIGHT0);
glEnable(GL_LIGHTING);
glEnable(GL_COLOR_MATERIAL);
// glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);

glCullFace(GL_BACK);
glEnable(GL_CULL_FACE);

glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);
glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST);
glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
---------------------------------------------

ToolChest
07-24-2002, 12:25 PM
Ya, I noticed you don't have any glDisable() in that code. You really need to disable every state in OpenGL that you are not using. Who knows what is lurking in the background of your app... :)

John.

zeckensack
07-24-2002, 03:43 PM
Originally posted by john_at_kbs_is:
ya I noticed you don't have any glDisable() in that code. you really need to disable every state in opengl that you are not using. who knows what is lurking in the background of you app... :)

John.

Sorry, but that's BS. There's a default state for everything in OpenGL. And NVIDIA's drivers seem to obey the spec quite strictly.

ToolChest
07-24-2002, 04:15 PM
I'm sorry, I didn't know NVIDIA's cards were the only cards on the market. Besides, you should never trust someone else to take care of your code for you; that's called bad programming... ;)

zeckensack
07-24-2002, 06:49 PM
Originally posted by john_at_kbs_is:
I'm sorry I didnt know NVIDIA's card were the only cards on the market. besides you never trust someone else to take care of you code for you, that's called bad programming... ;)

ndj55 said both machines have NVIDIA cards: a TNT2 and a GeForce2. ;)

Of course they're not the only ones on the market. Actually I have a Radeon 8500 in my primary box. :)

ndj55
07-25-2002, 12:48 AM
Take it easy, dudes :)
So nobody has any opinion about transforming display lists into arrays?
I don't have a lot of time at the moment, but perhaps I could give you a URL to get the program so you can test it on your computer?
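On the display-lists-to-arrays idea: glDrawElements wants one array of unique vertices plus an index list, while a 3ds-style loader usually hands you a flat triangle soup. A hedged sketch of the conversion (names are illustrative, not from ndj55's 3DSLoader; the linear search is fine for a few thousand vertices, a hash table would be better beyond that):

```c
#include <stddef.h>

typedef struct { float x, y, z; } Vec3;

/* Return the index of v in verts[0..*count), appending it if new. */
static unsigned find_or_add(Vec3 *verts, size_t *count, Vec3 v)
{
    for (size_t i = 0; i < *count; ++i)
        if (verts[i].x == v.x && verts[i].y == v.y && verts[i].z == v.z)
            return (unsigned)i;          /* shared vertex: reuse it */
    verts[*count] = v;                   /* unseen vertex: append it */
    return (unsigned)(*count)++;
}

/* Turn a triangle soup (3 * n_tris vertices) into unique vertices plus
 * indices. verts and indices must each have room for 3 * n_tris entries.
 * Returns the number of unique vertices. */
static size_t build_indexed(const Vec3 *soup, size_t n_tris,
                            Vec3 *verts, unsigned *indices)
{
    size_t n_verts = 0;
    for (size_t i = 0; i < 3 * n_tris; ++i)
        indices[i] = find_or_add(verts, &n_verts, soup[i]);
    return n_verts;
}
```

The result then feeds glVertexPointer and glDrawElements(GL_TRIANGLES, 3 * n_tris, GL_UNSIGNED_INT, indices); whether that beats display lists on the problem machine is exactly what such a test would show.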

Eric
07-25-2002, 12:50 AM
It looks like you are asking for the nicest polygon smooth. I think the default state for polygon smoothing is OFF but you may have enabled it somewhere else.

Now, cards up to the TNT2 used to ignore the GL_NICEST hint for polygon smoothing: they basically didn't perform ANY polygon smoothing (Matt, correct me if I am wrong). But when the GeForce came out, a lot of people (including me) found that their app was slowing down dramatically. That was because the GeForce chips actually DO polygon smoothing and this is really slow.

Try disabling polygon smoothing and everything should be fine.

Regards.

Eric

ndj55
07-25-2002, 01:08 AM
nope
it changed nothing...
The frame rate is still around 24 FPS...
It's really crazy... About 76 FPS on a TNT 2 and 24 on a geforce 2 GTS....

Eric
07-25-2002, 01:23 AM
Can you send a test app that I could run here (Dual Athlon MP 2000+ with GF4Ti4200) ?

It's very difficult to help you without seeing the running app (i.e. all we can do is give you the "usual" advice...).

Regards.

Eric

ndj55
07-25-2002, 03:10 AM
OK
Get all right here: http://openair.free.fr/public/TestProg.zip

There are four CPP files:
main.cpp = main file
3DSLoader.cpp = My 3ds loader
GLFunctions.cpp = Texture Loader
GLWindow.cpp = The Class Of The Window

The OpenGL init is in GLWindow; the display list generation is in 3DSLoader...

Please,no bad jokes about my coding style...

Eric
07-25-2002, 03:22 AM
OK, I have it here: I'll have a look at the code to see if I can find anything suspicious.

Is it possible to get the ".3ds" files you use as well, or are they confidential?

I could use any 3ds file I have but then we won't be able to compare the frame rates.

Regards.

Eric

ndj55
07-25-2002, 06:33 AM
Actually, the 3ds files are in the zip...
There are 8 3ds files...
My results with this test program are:
On the high-speed computer, about 24 FPS.
On the low-speed computer, about 76 FPS.

I can give you other 3ds files, but the ones in the zip are perfect for the test...

SirKnight
07-25-2002, 07:10 AM
I got from 200 to 225 fps on my sys.

Specs:
P3 600MHz
GeForce 4 Ti 4400 w/ Det. 29.42
256MB PC133 RAM
Windows XP Pro
etc... :)

-SirKnight

SirKnight
07-25-2002, 07:13 AM
Also I'm running with Quincunx AA turned on and with anisotropic filtering on at 8x (if that matters in this demo at all).

BTW, I like how you're doing your main menu (even if it doesn't let me select anything ;)); looks pretty cool.

-SirKnight

ndj55
07-25-2002, 07:29 AM
thx SirKnight!
The menu works in the real program but I removed the DirectInput layer for this test prog... There are some sounds too (ripped from Tony Hawk 3), but they were removed as well...

V-man
07-25-2002, 09:06 AM
getting 192 FPS

P3 450
Gf2 GTS 32 MB latest drivers
NT4

I might add that your menus are spinning too fast, and your coding style is easy to follow.

V-man

PH
07-25-2002, 09:15 AM
I ran your test too,

AthlonXP 1800+
Radeon 8500
Win2000

I got 299.75 fps.

ndj55
07-25-2002, 09:56 AM
wtf... my computer is really strange!
Perhaps something with my AGP port, but all my games work perfectly...

V-man -> Thanks. It's normal that the menu spins so fast; it's just because there is no speed limitation in this prog... Every frame, the angle of rotation is increased...
I am glad to read that you like my coding style :D

PH -> Thank you too, I hadn't tested my program on an ATI card.

[This message has been edited by ndj55 (edited 07-25-2002).]

Eric
07-25-2002, 11:39 PM
OK, my mistake... I started your project in debug mode and the 3ds files were in the release folder!

I ran the test here and got 564.33 fps (2x Athlon MP 2000+, GF4Ti4200, WinXP Pro).

Looks like your machine is really odd... Have you tried swapping the cards between your 2 computers?

Regards.

Eric

ndj55
07-26-2002, 05:18 AM
hum...
I think I am going to have a look at my computer and its GeForce 2 GTS...

MattS
07-26-2002, 06:40 AM
Hi,

I'm afraid this is more of a sympathy post. I have exactly the same problem with one of our machines here. None of my OGL apps run very quickly on it (and it dents my pride <sob>). So I thought I would test your app on that computer, and it performed very badly.

OK, my machine (which works fine) is a dual PIII 550 with a GeForce4 Ti 4200, and it managed 356 fps.

However, the other machine is a PIII 733 with a GeForce 256 and managed a mere 8 (yes, 8). I switched it to 32-bit colour and it leapt up to 29.

I wish I had a more productive answer for you, but I have not yet got to the root of the problem. I have tried swapping cards around and upgrading drivers, but it still has the same problem, so I am sure it is either the Windows install or the motherboard.

The user of the machine is away today, so I shall see if I can find out why it does so badly. The only way I have found of extracting even half-decent performance is by using VAR, and even then it has to be in VRAM, not AGP memory. Oh yeah, I've also run Quake III on it, and while not superb it was managing about 60fps (my machine manages about 95), but I have no idea how many triangles is typical for a QIII viewport.

Anyway, if I find anything I will post it here.

MarcusL
07-26-2002, 11:55 AM
Sounds like you haven't got any AGP memory. Check that the bus type is AGP (not "AGP (PCI Mode)") in the control panel (assuming you're using Windows).

Also try some wglAllocateMemoryNV calls and see if you can get AGP memory.

ScottManDeath
07-26-2002, 02:18 PM
Hi

I tested it:

PII 450 TNT2 det 29.42
desktop 1024x768x32 bit : 53fps
desktop 1024x768x16 bit : 8 fps

Maybe the driver's 16-bit code path is not really optimized, or you get a crazy pixel format?? Or you use different desktop resolutions.

Bye
ScottManDeath

DelNeto
07-27-2002, 06:32 AM
Hi,

Take a look at vertical synchronization. It must be disabled.

tcobbs
07-27-2002, 08:26 PM
Perhaps you are getting a non-accelerated pixel format? I get around 350 fps on my computer with a GF3, but when I rename the OpenGL driver (nvopengl.dll) to force software rendering, I get 16 fps. That's at least in the same ballpark as 25 fps.

What happens if you add PFD_GENERIC_ACCELERATED on the pfd.dwFlags line? If you get a failure on the GF2 after adding this, it would explain why a non-accelerated format would be chosen. If that's the case, you'll then have to figure out why you're getting a failure. Perhaps the cColorBits needs to be 24 instead of 32?

kevinhoque
07-28-2002, 02:33 AM
ndj55, tcobbs was right. You are not always getting an accelerated pixel format. Want to know why? It's because your pixel format descriptor structure is filled with garbage when you pass it to ChoosePixelFormat. Initialise all structure members to zero and that corrects the problem. i.e. this

PIXELFORMATDESCRIPTOR pfd = { 0 };

ALWAYS, always, always, zero your structures before you do anything with them. Be they Win32, directx, opengl, etc.

Kevin
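Kevin's rule can be shown outside Windows with a stand-in struct (the real PIXELFORMATDESCRIPTOR lives in <wingdi.h> and has many more fields; the flag values below are the ones wingdi.h defines):

```c
#include <string.h>

/* Simplified stand-in for PIXELFORMATDESCRIPTOR. */
typedef struct {
    unsigned short nSize;
    unsigned long  dwFlags;
    unsigned char  cColorBits;
    unsigned char  cDepthBits;
    unsigned char  cStencilBits;
    unsigned char  cAccumBits;
} FakePfd;

#define PFD_DOUBLEBUFFER   0x00000001
#define PFD_DRAW_TO_WINDOW 0x00000004
#define PFD_SUPPORT_OPENGL 0x00000020

/* Zero everything first, then set only the fields you mean to request.
 * An uninitialized struct asks for whatever garbage the stack held --
 * e.g. random stencil/accum bits that can push you to software rendering. */
static FakePfd make_pfd(void)
{
    FakePfd pfd;
    memset(&pfd, 0, sizeof(pfd));    /* or: FakePfd pfd = { 0 }; */
    pfd.nSize      = sizeof(pfd);    /* Win32 wants the size filled in */
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | PFD_DOUBLEBUFFER;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    return pfd;
}
```

The fields left at zero (stencil, accumulation) then read as "don't care" rather than as a random request.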

pbrown
07-28-2002, 06:07 PM
Originally posted by kevinhoque:
ndj55, tcobbs was right. You are not always getting an accelerated pixel format. Want to know why? It's because your pixel format descriptor structure is filled with garbage when you pass it to ChoosePixelFormat. Initialise all structure members to zero and that corrects the problem. i.e. this

PIXELFORMATDESCRIPTOR pfd = { 0 };

ALWAYS, always, always, zero your structures before you do anything with them. Be they Win32, directx, opengl, etc.

Kevin

Not sure that zeroing matters (there is a flags field, right?), but even if you get everything else right, you may still end up without an accelerated format. On some platforms, the Z buffer and the color buffer had to be the same size (16/16 or 32/32). Also on some platforms, pixel formats with 16-bit Z buffers have no stencil (or a separate software stencil). Put the two together, and you would always fall back to software in 16-bit mode if you ask for a stencil buffer.

Eric
07-28-2002, 11:47 PM
Originally posted by kevinhoque:
ALWAYS, always, always, zero your structures before you do anything with them. Be they Win32, directx, opengl, etc.

Not bad advice, but remember that some structures in the Win32 API have one member that indicates the size of the structure: better not set that one to zero...

Now, to get back to ndj55's problem: it should be easy for him to check whether he gets an accelerated format or not. If he does get one, then I think macke is right: there must be some kind of AGP problem... I am not sure if this can be checked with PowerStrip.

Regards.

Eric

kevinhoque
07-28-2002, 11:56 PM
I've seen this specific problem occur many times, with people wondering why they were getting mixed rendering results: on some machines rendering was being done in hardware, on others in software. Every single time it was because their pixel format descriptor structure was filled with garbage. Garbage In -> Garbage Out. ChoosePixelFormat will *try* to provide a hardware-accelerated pixel format, but if you give it undefined parameters (which by their very nature can be anything), is it any surprise that software rendering frequently occurs?

As to the "flags field": there is a structure member dwFlags. This has nothing to do with whether specific fields in the structure are valid (or not). These flags are general behaviour flags that you would like the pixel format to conform to, i.e. PFD_DOUBLEBUFFER, etc.

On the machines where software rendering is occurring, ndj55 in all likelihood will be *asking* for stencil buffers and accumulation buffers, etc., *even though he did not explicitly specify these*, just because his structure is filled with garbage. A fall-back to software rendering is the outcome in most cases.

Kevin

Eric
07-29-2002, 12:06 AM
Kevin,

I have had a look at ndj55's code and the pixel format he's asking for looks perfectly correct.

I agree with you that in most cases the problem can come from a wrong PIXELFORMATDESCRIPTOR structure but I do not think this is the case here.

As I said in my previous post, it should be easy to check whether the format he gets is accelerated or not !

Regards.

Eric
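For the check Eric mentions: after SetPixelFormat, DescribePixelFormat fills in dwFlags, and the two "generic" bits tell you which renderer you got. A sketch of the usual interpretation, with the flag values from <wingdi.h> (only the classification logic runs outside Windows):

```c
#define PFD_GENERIC_FORMAT      0x00000040
#define PFD_GENERIC_ACCELERATED 0x00001000

/* Returns 1 for a hardware format (ICD or MCD),
 * 0 for Microsoft's software renderer. */
static int is_accelerated(unsigned long dwFlags)
{
    if (!(dwFlags & PFD_GENERIC_FORMAT))
        return 1;    /* full ICD, e.g. nvopengl.dll */
    if (dwFlags & PFD_GENERIC_ACCELERATED)
        return 1;    /* generic format, but driver-accelerated (MCD) */
    return 0;        /* plain generic: software rendering */
}
```

On Windows you would pass in the dwFlags of the PIXELFORMATDESCRIPTOR returned by DescribePixelFormat for the format actually selected.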

kevinhoque
07-29-2002, 12:06 AM
Eric! Obviously structures should be zeroed *before* use. They should then always be filled as per API guidelines. Shame on you for trying to score cheap points.

ndj55 is the second person I've spoken to within the last 7 days who had this same problem. ndj55, if you are listening, then make the change to your code that I suggested and give us an update. I've already made the change: on my machine the framerate jumped from 10Hz to 300Hz.

Kevin

kevinhoque
07-29-2002, 12:21 AM
Eric, I hope that it is early morning where you are, since your logic is slightly awry. You previously reported that you were getting blinding framerates (564.33 fps). Why on earth would you expect to see things going wrong with the pixel format structure on your machine?

On my more humble machine I originally got 10Hz (generic), but with the code change 300Hz (nvidia). What are you going to say now? That you don't believe me? Many others before me mentioned that perhaps his problem was related to a non-hardware-accelerated pixel format. I am only confirming the obvious. Someone out there (who suffered from low framerates, so not you Eric :]) please recompile the code and second this finding.

Kevin

Eric
07-29-2002, 12:24 AM
Originally posted by kevinhoque:
Eric! Obviously structures should be zeroed *before* use. They should always then be filled as per API guidelines. Shame on you for trying to score cheap points.

Kevin,

I am not here to score anything.

Actually, you may very well be right: I thought that ndj55 zeroed his structure before passing it to ChoosePixelFormat. I have just checked again and I was wrong.

To be honest, I *DO NOT* zero my PIXELFORMATDESCRIPTOR before passing to ChoosePixelFormat because I *DO NOT* use ChoosePixelFormat (I use a custom function that does not use the same criteria).

Now, let's see if that fixes the problem.

Regards.

Eric

Eric
07-29-2002, 12:30 AM
Originally posted by kevinhoque:
Eric, I hope that it is early morning where you are since your logic is slight awry. You previously reported that you were getting blinding framerates (564.33 fps). Why on earth would you think that you would see things going wrong with the pixel format structure on your machine?

On my more humble machine I originally got 10Hz (generic) but with the code change 300Hz (nvidia). What are you going to say now? That you don't believe me? Many others before me mentioned that perhaps his problem was related to a non-hardware accelerated pixel format. I am only confiming the obvious. Someone out there (who suffered from low framerates, so not you eric :]) please recompile the code and second this finding.

Kevin


Kevin,

This is your 6th post on this board and this is the 2nd one that is aggressive towards me. What's wrong with you?

I know I do not have any problem on my machine but I thought I may find something wrong in *THE CODE* that ndj55 sent me: I looked at the code, didn't see anything wrong and then said so. You pointed out that the structures were not zeroed and I said you may well be right.

What more do you want? Why the hell do you show so little respect to other people?

Eric

BlackJack
07-29-2002, 12:40 AM
469.33 @ GeForce 3 / Athlon XP 1800...

p.s. zeckensack: "Sorry, but that's bs. There's a default state for everything in OpenGL. And NVIDIA's drivers seem to obey the spec quite strictly."

Well, trusting is good, but checking is better... if the prog is supposed to run on Intel chipsets anyway ;).

[This message has been edited by BlackJack (edited 07-29-2002).]

kevinhoque
07-29-2002, 02:31 AM
Eric, there's nothing 'wrong' with me. Thanks for asking. :]

If you feel that I was being aggressive towards you, then accept my apologies. No aggression was intended... hence the smiley face in the last post?

Let's find out if the given advice fixes his problem without letting rip with any more 'Why the hell do you show so little respect to other people?' That outburst wasn't exactly designed to endear? :]

P.S. Thanks for admitting that I might be correct. If I'd seen that post earlier, then perhaps I would have added more smiley faces. :)

Kevin

ndj55
08-04-2002, 06:48 AM
Calm down, people...
I am on holiday now and I can't test the last changes you told me about...
See you in 2 weeks!

kevinhoque
08-04-2002, 07:04 AM
Enjoy your hols! 8)

ndj55
08-21-2002, 03:22 AM
OK,
back from holidays...
I am going to take a fresh look at the problem now...

masterpoi
08-21-2002, 12:06 PM
athlon 1400
radeon 8500

-> 23 fps????

hmm, strange

MtPOI

Antorian
08-21-2002, 02:00 PM
Athlon XP 1800+
GeForce2 MX
256MB (DDR)
Win98
29 fps (and 34 fps with vertical sync off)
(Latest Detonator, Aug 7th 2002)

Duron 800
Voodoo3 2000 PCI
512MB SDRAM
Win98
23 fps

How can you get a frame rate going up to 500 fps???
Have you ever tried to understand why there is such a difference?

Hmmmm, I am not proud of my Athlon :P
(But in my own implementation I render more polygons, and I've run at up to 85 fps = the monitor's selected refresh rate.)
NDJ55, I think the problem must come from the display lists (I already had the same problem), because I saw on another forum that on some cards display lists aren't all managed the same way (even NVIDIA cards).
Maybe I'm wrong about your problem (probably), but if you find out why we all got such different FPS, please tell me.
Ouch, I can't stand my Athlon doing worse than my Duron :P

Miguel_dup1
08-21-2002, 07:20 PM
P3 933
512MB RAM
GeForce 2 Ti

And I get 240 fps on that bad boy...
With vsync and AA off.

masterpoi
08-22-2002, 09:45 AM
I tried drawing the models directly (not using display lists). Now the framerate is 40 fps... (640x480)

MtPOI

ndj55
08-24-2002, 02:47 AM
I don't understand... I thought it was my computer, but it seems to be something else... So annoying... I am still working on it, but no result at the moment...
If I zero the PIXELFORMATDESCRIPTOR, it doesn't change anything. I checked, and I do have a hardware-accelerated format...

[This message has been edited by ndj55 (edited 08-24-2002).]