Using hardware acceleration on a GeForce2 MX.

When I’m working with OpenGL, I’m sure the hardware acceleration of the card is not being used. I made a test: in the card options, when acceleration is enabled, an NVIDIA logo appears in the bottom-right corner of the screen. But with my OpenGL programs, this logo doesn’t appear.
What do I do to use it?
Thanks

that logo is just for DirectX programs. you surely have hardware GPU support in OpenGL; it should be automatic (search for nvopengl.dll or nvopengl32.dll and you’ve got your "hardware acceleration")

btw, don’t expect OpenGL to get more than one frame per second when it is NOT AT ALL hardware accelerated (most of the time it’s much less), and you don’t have any extensions then, so no multitexturing; basically nothing newer than 2 or 3 years would work

[This message has been edited by davepermen (edited 03-19-2001).]

Originally posted by Bahamut_france:

When I’m working with OpenGL, I’m sure the hardware acceleration of the card is not being used. I made a test: in the card options, when acceleration is enabled, an NVIDIA logo appears in the bottom-right corner of the screen. But with my OpenGL programs, this logo doesn’t appear.
What do I do to use it?
Thanks

this is in the DirectX settings window!!! so you only get the logo in DirectX programs, NOT OpenGL. if you had software OpenGL, you would have only 3 extensions and normally not even transparency, you couldn’t play any demo or any game (like Q3, for example), and everything would be daaaamn slow

is it like that, or is it smooth (more than 1 frame per second)?

I’m importing object files (made in 3DS MAX) to build a character… (with smooth mode)
The performance dropped: 100 frames/s -> 5-6 frames/s

But the drawing isn’t very complex yet, is it? Is the rendering done in software? Shouldn’t the GeForce2 MX be using hardware acceleration?

For building games, would Direct3D be faster???

ok, now we’ve got something to begin working with

hm… how big is your mesh? what extensions do you use? how do you render?

when it sometimes hits 100 fps, I don’t think it’s software rendering (somehow not possible, I think), but something is eating up your processor speed…

(you know, even the simple cube demo from NeHe (lesson 6) got about 10 fps here in software, so…)

what other programs are running while YOUR program runs? what do you do besides drawing the mesh? do you hear your hard drive spinning fast while rendering? whatever it is, I want to know everything

but first of all, what drivers do you use?

My program is running in release mode. Only VC++ is running at the same time.
It uses lighting (only 1 light), double buffering, true color, smooth shading, etc.
I’m also using object files imported from 3DS MAX.

In a 3D world, I have placed 12 imported objects (to build a character… but that’s not the point here).

While turning my camera (by moving the mouse…), I can notice two things:

  • while the program is drawing only one element (e.g. a cube…), my computer runs as fast as usual… but when I try to move the camera around my robot (made from many cubes, spheres, etc…), it is far too slow to be displayed!!!
  • changing the shading mode from SMOOTH to FLAT doesn’t help either.

For information, I’m using the GLU, GLUT and GLAUX helper libraries.
I’m also using face culling, the Z-buffer, etc.

My hard drive? I don’t hear it rotating faster than usual, so I think not. The LED isn’t lit up very often…
What else? I think the problem is in the drawing of the elements.

If you want to know anything else, just ask.

Thanks for help.

Also, I would like to know if there are OpenGL functions (like the Z-buffer, culling, …) to improve the speed of my application?

An easy way to check whether you’re hardware accelerated is to call glGetString(GL_EXTENSIONS) after creating your OpenGL context. If the returned string contains a whole bunch of extension names (GL_ARB_multitexture, GL_NV_…, etc.), then you win: you’re hardware accelerated.
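Since the thread keeps coming back to this check, here is a minimal sketch in C of how one might look for a specific name in that extension string. `has_extension` is a hypothetical helper name, and the string here is just a stand-in for what glGetString(GL_EXTENSIONS) would return in a real program; note that a bare strstr() alone can false-match a prefix of a longer extension name, so the token boundaries are checked too:

```c
#include <string.h>

/* Look for a whole, space-delimited name in an extension list such as the
 * one returned by glGetString(GL_EXTENSIONS). A bare strstr() would also
 * match "GL_ARB_multitexture" inside a longer name, so check that the hit
 * starts and ends on a token boundary. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts_ok = (p == ext_list) || (p[-1] == ' ');
        int ends_ok   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_ok && ends_ok)
            return 1;   /* exact token found */
        p += len;       /* keep searching after this partial match */
    }
    return 0;
}
```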

Originally posted by Bahamut_france:
I’m importing object files (made in 3DS MAX) to build a character… (with smooth mode)
The performance dropped: 100 frames/s -> 5-6 frames/s

Ahh… guys. He mentions “(smooth mode)”; I interpret this as using polygon smoothing? The NVIDIA gods say that those who use polygon smoothing (glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST), etc.) are the ones who get sucky framerates.

I remember something about polygon and line smoothing on GeForce cards sucking arse in some other thread. I definitely know that using smoothed lines kills the framerate, but I think Matt said the same thing about polygon smoothing too.

Bahamut, try disabling smoothing (if indeed this is what you’re using) and see what happens. Good luck.

Siwko

Oh, wait… are you talking about polygon smoothing, or about smooth shading? I think you mention the latter somewhere, in which case my previous post means nothing.

Anyway, good luck.

Yeah Siwko, enabling GL_POLYGON_SMOOTH kills any app running on a GeForce (I don’t know about the GeForce3, though: I am still waiting for the board to arrive!).

Actually, I experienced it when I swapped my old TNT2 Ultra for a GeForce256: all my apps were running like snails… When I disabled polygon smoothing, everything started to fly!!!

As far as I understand, most boards won’t do any polygon smoothing (they ignore the enable/disable and the hint!), hence they don’t slow your app down… but the GeForce family actually tries to smooth your polygons!!!

Regards.

Eric

Siwko:
I’m not using 3DS MAX models with OpenGL!!!
In my program, I’m loading files in Wavefront object format (.obj).
I’m using 3D Exploration to convert from 3DS MAX to Wavefront object.
Also, without smooth mode, there is the same problem!

Eric:
Should I enable or disable polygon smoothing for the best performance?
(Excuse me, I don’t understand everything. I’m just French…)
And polygon smoothing and smooth shading… they aren’t the same thing, no?

Originally posted by Bahamut_france:
[b]
Siwko:
I’m not using 3DS MAX models with OpenGL!!!
In my program, I’m loading files in Wavefront object format (.obj).
I’m using 3D Exploration to convert from 3DS MAX to Wavefront object.
Also, without smooth mode, there is the same problem!

Eric:
Should I enable or disable polygon smoothing for the best performance?
(Excuse me, I don’t understand everything. I’m just French…)
And polygon smoothing and smooth shading… they aren’t the same thing, no?[/b]

Oh, I understood that! I honestly don’t know what you’re using for models, modeling packages, etc. In fact, I wasn’t even referring to that. I was referring to the OpenGL viewer/code that you’re using (writing?). Using smooth versus flat shading has a minimal impact on frame rate. Using polygon smoothing, i.e. glHint(GL_POLYGON_SMOOTH_HINT, GL_NICEST) and such, makes any current NVidia card’s framerate drop to a crawl. To ensure it’s disabled, use glHint(GL_POLYGON_SMOOTH_HINT, GL_FASTEST) and glHint(GL_LINE_SMOOTH_HINT, GL_FASTEST); that should extract the fastest performance out of your geometry processor and rasterizer without tweaking the other bottlenecks in your code.
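In code, that suggestion boils down to two calls. This is just a fragment, not a full program: it assumes you already have a current OpenGL context and the standard GL headers included.

```c
/* Ask the driver for the fastest (typically no-op) smoothing behaviour
 * for both polygons and lines. Assumes a current OpenGL context. */
glHint(GL_POLYGON_SMOOTH_HINT, GL_FASTEST);
glHint(GL_LINE_SMOOTH_HINT, GL_FASTEST);
```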

Mind you - this may not solve your problem if the problem is actually somewhere else.

I hope that helps.

Siwko

Thanks SIWKO.

I will try it.

Bahamut.

Actually, you don’t need to use glHint at all. You can simply use those two calls:

glDisable(GL_LINE_SMOOTH);
glDisable(GL_POLYGON_SMOOTH);

I think it is easier to deactivate a feature than to change one of its parameters to achieve the same effect.

Regards.

Eric

P.S.: Sorry for non-french speakers. Bahamut, je suis francais aussi donc tu peux m’ecrire par e-mail si tu as des questions. Mon e-mail est dans mon profile. Reponds en anglais sur le forum, stp !

Originally posted by Eric:
[quote][b]Actually, you don’t need to use glHint at all. You can simply use those two calls:

glDisable(GL_LINE_SMOOTH);
glDisable(GL_POLYGON_SMOOTH);[/b][/quote]

Hmm… I never even thought of that.

I haven’t heard anyone mention the most obvious possibility for framerate death here… What is the poly count on the objects you are importing? You say you are importing several objects: spheres, cubes, etc. If you are importing objects that contain spheres generated in 3D Studio or some other modeling package, your poly count is probably really high (for real-time rendering on something like a GF2 MX). When I first started importing 3D models, I noticed that poly counts seemed unnecessarily high. You have to carefully manage your polygon budget when you are doing the modeling.

Originally posted by GHoST:
I haven’t heard anyone mention the most obvious possibility for framerate death here… What is the poly count on the objects you are importing? You say you are importing several objects: spheres, cubes, etc. If you are importing objects that contain spheres generated in 3D Studio or some other modeling package, your poly count is probably really high (for real-time rendering on something like a GF2 MX). When I first started importing 3D models, I noticed that poly counts seemed unnecessarily high. You have to carefully manage your polygon budget when you are doing the modeling.

GHoST: " When I first started importing 3d models I noticed that poly counts seemed unnecessarily high"

I know! My poly count is not high! 3DS MAX initially creates spheres with a lot of polygons… but you can reduce this count…

My problem isn’t there!

Hmmm, well, I figured it was worth mentioning. OK, then you should do as Olive suggested and test for extensions… NeHe has a tutorial that will output a list of all detected extensions in a graphical format.
Here is the link: http://nehe.gamedev.net/tutorials/lesson25.asp

Run that, and if it comes back with 30 or so, you are set up for hardware acceleration. If you only get like 3, then something is wrong with your GL libraries or your video card drivers. Hardware acceleration should be “automatic” if you have the proper libraries and drivers.
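A rough sketch of that “count the extensions” idea as a plain C helper. `count_extensions` is a hypothetical name, and the string in the test is just a stand-in for what glGetString(GL_EXTENSIONS) would return after context creation:

```c
/* Count the space-separated names in an extension list such as the one
 * returned by glGetString(GL_EXTENSIONS). */
static int count_extensions(const char *ext_list)
{
    int count = 0;
    int in_token = 0;

    for (; *ext_list != '\0'; ext_list++) {
        if (*ext_list != ' ') {
            if (!in_token) {    /* first character of a new name */
                count++;
                in_token = 1;
            }
        } else {
            in_token = 0;
        }
    }
    return count;
}
```

If this reports only a handful of names, you are almost certainly on the software renderer.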

Just out of curiosity, what is your poly count? If your rendering code isn’t heavily optimized, you’d be surprised how fast you can bog your machine down.

Hope this helps