Am I using my 3D card?

I’m taking an OpenGL programming class. I asked my professor this question, but he didn’t know. Do the programs I write (Win32 console apps using standard gl, glu, and glut calls) automatically use my 3D card? If so, how can I test it to be sure? If not, how do I enable HW acceleration?

If it matters, I’m running a GeForce 2 with the latest drivers, and using VC++. Thanks.

It’s pretty hard to turn acceleration off.
You can check by rendering a few thousand triangles (ideally with blending and texturing enabled). If the frame rate comes out above 10 fps, it’s accelerated ;-]
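Here’s a minimal sketch of that kind of test, assuming a standard GLUT setup. The triangle count, window size, and the crude console FPS counter are just illustrative choices, not from the original post:

```c
/* Sketch: draw a few thousand blended triangles and print the frame
 * rate to the console. If it stays well above ~10 fps on a GeForce 2,
 * you are almost certainly getting hardware acceleration. */
#include <GL/glut.h>
#include <stdio.h>

static int frames = 0;
static int lastTime = 0;

static void display(void)
{
    int i;
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glBegin(GL_TRIANGLES);
    for (i = 0; i < 5000; ++i) {          /* a few thousand triangles */
        glColor4f(1.0f, 0.5f, 0.2f, 0.5f);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
    }
    glEnd();

    glutSwapBuffers();

    /* crude FPS counter using GLUT's millisecond timer */
    ++frames;
    {
        int now = glutGet(GLUT_ELAPSED_TIME);
        if (now - lastTime >= 1000) {
            printf("%d fps\n", frames);
            frames = 0;
            lastTime = now;
        }
    }
    glutPostRedisplay();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("fps test");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```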


Depends what you do. E.g. the GeForce 2 doesn’t have a hardware-accelerated accumulation buffer, so if your program uses the accumulation buffer it won’t be accelerated.
Also, with a GeForce, stencil buffers are only accelerated when your window is set to 32-bit colour, not 16-bit colour.
Basically what I’m saying is there’s no hard and fast rule for knowing whether something is hardware accelerated or not.
BTW most things on the GeForce are hardware accelerated (some more than others).
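One thing you can check on Win32, though, is whether the pixel format you ended up with is the generic (software) one. A rough sketch, assuming you already have an HDC for a window whose pixel format has been set; the function name reportAcceleration is just for illustration:

```c
/* Sketch: inspect the pixel format flags to see whether the generic
 * Microsoft (software) implementation was selected. */
#include <windows.h>
#include <stdio.h>

void reportAcceleration(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format = GetPixelFormat(hdc);

    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);

    if (pfd.dwFlags & PFD_GENERIC_FORMAT) {
        if (pfd.dwFlags & PFD_GENERIC_ACCELERATED)
            printf("MCD: accelerated through the generic driver\n");
        else
            printf("Microsoft generic implementation: software only\n");
    } else {
        printf("ICD: vendor driver, hardware accelerated\n");
    }
}
```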

A way to check whether you are using the default Microsoft implementation (which would be all software) or the driver provided by your chip/card manufacturer is to check the string returned by glGetString(GL_VENDOR). If it’s an MS driver, it’s software. If it’s not, it’s a driver that COULD be using hardware for some things (not necessarily all things, though). If you’ve got a GeForce, you’re probably using hardware for a lot of things. Nvidia is great at providing good OpenGL support. If you haven’t already, I’d recommend using their Detonator drivers as they are REALLY good. There is supposed to be a new version of them out sometime soon. (There’s a rumor on one of these boards that they’ll be out this Mon.)
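A quick sketch of that check (the window title and the extra GL_RENDERER/GL_VERSION prints are just illustrative; call glGetString only after a context exists, i.e. after glutCreateWindow):

```c
/* Sketch: print the vendor/renderer strings to see which OpenGL
 * implementation you actually got. */
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutCreateWindow("vendor check");

    printf("GL_VENDOR:   %s\n", (const char *) glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *) glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *) glGetString(GL_VERSION));

    return 0;   /* no need to enter the main loop for this check */
}
```

On the Microsoft software implementation you’d expect something like "Microsoft Corporation" / "GDI Generic"; with the NVIDIA driver installed you should see "NVIDIA Corporation" and the name of the card.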