Can I let DirectX choose the acceleration card for OpenGL?

Hi!
So… I made this program and it looks quite cool, but I don’t like having it run in a small window (or a window at screen size). I would like the program to choose the best (or only) card to run with.
I know from DirectX that there are a few functions that let me decide whether to use just the graphics card or the acceleration card. Does OpenGL also have such a function (or header file) that lets me choose that, or is it possible to let DirectX choose it for me?

Thank you very much,
Fabian

Hi!
IMHO OpenGL automatically uses your best graphics card with hardware acceleration. So you don’t have to query for HAL or T&L support like with Direct3D.

WOW! Thanx first of all…
So that means… if I have a machine with a GeForce3, it automatically runs on it, right?

But doesn’t it only support that in fullscreen mode?
So my question now is: as long as I can see the X in the upper right for closing the program, it’s not called fullscreen, is it? And is my OpenGL application hardware accelerated while I can still see that X?

Thank you again!
Fabian

With most modern cards, yes, in most cases you are hardware accelerated even in a window. The “fine print” is of course that you have to choose a hardware accelerated pixel format. You’ll find a couple of past discussions about this. The idea is that if you request, say, a 16-bit Z buffer and your card supports only 24-bit, your OpenGL program still starts, but it starts in software rendering mode. Not sure if that’s a good example; I have been very careful to verify hardware accelerated modes ever since I had a Voodoo2 at home and a GeForce3 at work.
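To make that concrete, here is a minimal sketch of how you might verify on Windows that the pixel format you ended up with is actually hardware accelerated. This is standard wgl practice rather than anything from the posts above; the function name is just an example:

```cpp
// Sketch: after choosing a pixel format, check whether it is actually
// hardware accelerated. Purely software formats carry PFD_GENERIC_FORMAT
// without PFD_GENERIC_ACCELERATED.
#include <windows.h>

bool IsFormatAccelerated(HDC hdc, int pixelFormat)
{
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(hdc, pixelFormat, sizeof(pfd), &pfd);

    // Generic Microsoft implementation, no MCD: software rendering.
    if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
        !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
        return false;

    return true; // an ICD (or MCD) is backing this format
}
```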

Good luck!

OpenGL chooses a card? No, it doesn’t. GL doesn’t care where it’s running as long as it executes commands.

If you are talking about multi-monitor support, search this group for it.

V-man

So… Thank you again!
Umm… it’s clear now that SOME newer cards even support windowed mode. OK, but from what I just read, OpenGL does NOT detect the best card? Quite confusing…
However, I am not talking about several monitors! I just mean the following example:
I have a Voodoo2 3D accelerator and a “normal” graphics card that runs the usual windowed programs on my Windows system. When I start my application in windowed mode, I thought it would NOT be accelerated then for sure… So, for fullscreen mode: do I have to set that up myself, or does it really get detected automatically?

Thanx & sorry for my bad English!
Fabian

Ah, the Voodoo2 is a funny beast - quite exceptional in how it works compared to cards younger than, say, 4 years old.
It’s a secondary card that is not hardware accelerated when not running fullscreen. Also, it doesn’t really support OpenGL properly: as I remember, you have to dynamically load the 3dfx OpenGL ICD yourself using LoadLibrary, retrieve all the function entry points yourself using GetProcAddress, initialise the card with a special function (can’t remember what it’s called), and then use the entry points you retrieved instead of the standard OpenGL function calls.
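Roughly, that first approach looks like the sketch below. It’s only a sketch: the DLL name comes from later in this thread, and the 3dfx-specific init call is left as a comment because its name escapes me:

```cpp
// Sketch of dynamically loading the 3dfx GL and pulling entry points.
// "3dfxgl.dll" is the name mentioned later in this thread; the
// vendor-specific init call is omitted since its name is unknown.
#include <windows.h>
#include <GL/gl.h>

typedef void (APIENTRY *PFNGLCLEAR)(GLbitfield mask);

static HMODULE    g_glDll  = NULL;
static PFNGLCLEAR pglClear = NULL;

bool Load3dfxGL(void)
{
    g_glDll = LoadLibraryA("3dfxgl.dll");
    if (!g_glDll)
        return false;

    // Repeat this for every GL entry point your program uses,
    // then call pglClear(...) etc. instead of glClear(...).
    pglClear = (PFNGLCLEAR)GetProcAddress(g_glDll, "glClear");

    // ...the special 3dfx initialisation call would go here...

    return pglClear != NULL;
}
```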
Or you can rename 3dfxgl.dll to opengl32.dll, but then you have to be careful about using unsupported OpenGL functions, because it’s really just a miniGL implementation.
Good luck!

(Search these forums for Voodoo2 - you’ll probably find a lib where someone has already done the LoadLibrary/GetProcAddress work for you.)

Originally posted by knackered:
Also, it doesn’t really support OpenGL properly: as I remember, you have to dynamically load the 3dfx OpenGL ICD yourself using LoadLibrary, retrieve all the function entry points yourself using GetProcAddress, initialise the card with a special function (can’t remember what it’s called), and then use the entry points you retrieved instead of the standard OpenGL function calls.

Well, you could just write an import library for that 3dfxgl.dll, or let some tool generate one for you, to get those function names into the import section of the exe…
LoadLibrary and GetProcAddress calls make the code look like crap, at least in my opinion.

I believe SDL will handle all this for you: www.libsdl.org
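For instance, a minimal SDL (1.x-era) program hands the whole pixel-format and context-creation problem to the library; the attribute values and window size below are just example numbers:

```cpp
// Minimal SDL 1.x sketch: SDL picks a pixel format and creates the
// GL context for you, windowed or fullscreen.
#include <SDL.h>
#include <SDL_opengl.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 16);
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

    // Drop SDL_FULLSCREEN to run in a window instead.
    if (!SDL_SetVideoMode(640, 480, 0, SDL_OPENGL | SDL_FULLSCREEN)) {
        SDL_Quit();
        return 1;
    }

    glClear(GL_COLOR_BUFFER_BIT);
    SDL_GL_SwapBuffers();
    SDL_Delay(2000);  // show the cleared frame for two seconds

    SDL_Quit();
    return 0;
}
```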

IMHO you should just drop that Voodoo2 card from your list of target hardware.
It’s a strange beast.

Seriously, any ‘modern’ gfx card won’t require much work from you to get hardware acceleration in windowed and fullscreen modes.

I don’t think you have to worry about it too much (unless you’re using the Voodoo2 - but then I have to ask: why?)

I have a dual monitor setup with a good 3D card and a cack one. I’ve never attempted to run an app on the cack one, and I’ve never seen Windows XP try to… I presume Win XP gives priority to the AGP port over PCI. I may be wrong, but I’ve never seen anything to contradict me here…

Alright!
So… to calm you down: I don’t own a Voodoo2! It was actually just an example… I do have a Voodoo1 and a GeForce2… so the fact is that, in terms of speed and image quality, both of those cards run my application completely differently, right?
So… in case I want to distribute my OpenGL game (just in CASE!!)… I need to let the customer know that my game exclusively supports GeForce graphics cards, right? If I wanted my prog to support Voodoos, I’d need to make some kind of option menu or something. That is what I was actually asking about, because (as far as I know) DirectX detects the card automatically… it decides between HEL(L) and HAL.

Question: Is it correct to think that I should just write everything for GeForces and not for the older cards, because most games would run too slowly on the computers with the older cards anyway? Is that the reason why every current game really requires a GeForce? Right or wrong?

Just one more stupid question: what does IMHO mean?

@Arath:
IMHO - “in my humble opinion”

(Have you never heard of Babylon, the real-time translation program? This funny thing will tell you all those funny abbreviations.)

As far as I know, an OpenGL “driver” is a whole implementation written specifically for some accelerator. So OpenGL doesn’t choose a card; it just uses the one it was written to work with. So if someone has two cards and wants to run a game on both of them, he just has to use the correct driver (implementation) for the card he wants to use. Since implementations in Windows are single DLLs (again AFAIK - I’m using Linux), what you have to do is link the program with the correct DLL. If you want to choose in-game (which I assume you want to do), then you’ll have to link at runtime. I’m not sure how this works (especially in Windows), but some games (like Quake 2 and 3) seem to do it.
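The runtime-linking idea is the same LoadLibrary/GetProcAddress machinery sketched earlier in the thread, just wrapped so that the DLL name becomes an in-game choice. This is roughly what the Quake engines do with their “qgl” layer; the names below are examples, not anything standardised:

```cpp
// Sketch: bind whichever renderer DLL the player chose at runtime.
// The DLL file names are examples only.
#include <windows.h>
#include <GL/gl.h>

typedef const GLubyte* (APIENTRY *PFNGLGETSTRING)(GLenum name);

PFNGLGETSTRING qglGetString = NULL;

HMODULE BindRenderer(const char* dllName) // e.g. "opengl32.dll" or "3dfxgl.dll"
{
    HMODULE dll = LoadLibraryA(dllName);
    if (!dll)
        return NULL;

    // One GetProcAddress per entry point; real engines generate this list.
    qglGetString = (PFNGLGETSTRING)GetProcAddress(dll, "glGetString");
    if (!qglGetString) {
        FreeLibrary(dll);
        return NULL;
    }
    return dll; // keep the handle so the renderer can be unloaded and swapped
}
```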

If your ‘customer’ (let’s put it that way) has both a Voodoo 1/2 and a GeForce (by the way, there are other integrated 2D/3D cards out there too - Radeons, SiS stuff, Voodoo 3/4/5, Intel stuff, you know…), and you just run vanilla GL code, it will run not on the Voodoo but on the other card. The Voodoo 1/2 doesn’t even have proper GL drivers, so there’s no need to choose.

I believe there is a fair amount of misinformation in this thread. Allow me to help clarify, as a somewhat technologically backward OpenGL’er who only added a TNT2 to his Voodoo1 setup a few days ago.

First of all, there is a full OpenGL solution for 3dfx owners. In fact there are two. The final 3dfx driver set, a.k.a. the Quake 3 driver set, included 3dfx’s implementation of a ‘full’ OpenGL. In actual fact I’ve found that glPolygonOffset doesn’t work on Voodoo1s, but I think this is a bug rather than a shortcut. It doesn’t matter, however, because Mesa 3D is another OpenGL-for-3dfx solution, and a much better one. I found that with it my application ran faster, and with all the OpenGL calls supported as far as I could make out.

Neither require a special call to start them up.

The only complication is that Microsoft only supports one device installed as the OpenGL target at a time. Prior to the Voodoo3, as has been noted in this thread, the cards could not do windowed rendering, so it is not safe to install them as the OpenGL target. Therefore it is up to the application to do one of the following things:

  • check for the existence of whatever the 3dfx OpenGL DLL is called (I forget its name), and if it is present, dynamically load it and get function pointers from it, as advised in previous posts
  • put a note in your program telling 3dfx owners that they should place a 3dfx-type opengl32.dll into the same directory as they installed your game. 3dfx owners are used to this.

If following the second case, you could make it very easy for them, and because MESA is GPL, put a copy on your web page or even into your installation, and include very specific instructions for downloading it and extracting it to the same place, or include a little screen in your installer asking if they have a 3dfx. Even better if you are not using an off-the-shelf installer: try to load the glide library and get accelerator information from it to determine what type of Voodoo, if any, is present.
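One cautious way to do that last check, without committing to any particular Glide entry point, is simply to see whether a Glide runtime DLL can be loaded at all. The names below are the usual 3dfx runtime DLL names; actually querying board details would need real Glide calls on top of this:

```cpp
// Sketch: detect whether a Glide runtime (and therefore probably a
// Voodoo board) is installed, without calling into the Glide API itself.
#include <windows.h>

bool GlidePresent(void)
{
    const char *names[] = { "glide3x.dll", "glide2x.dll" };
    for (int i = 0; i < 2; ++i) {
        HMODULE h = LoadLibraryA(names[i]);
        if (h) {
            FreeLibrary(h);
            return true; // a Glide runtime exists; ask the user or probe further
        }
    }
    return false;
}
```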

The only slight downside to the Mesa solution is that the makefiles for 3dfx support under Windows seem to have gone missing in the 4.x releases, so you are stuck with the 3.x series, which only tries to match the OpenGL 1.2 API. 4.x matches 1.3.

I have used the second solution in my current project, simply putting an extra zip file on the downloads page, and have had no complaints or queries from voodoo owners as to why it does not seem to be using their hardware. I do not believe that anyone new enough to computers not to know how to copy a file has a machine old enough to house a voodoo 1 or 2 board. I also believe people with these boards are accustomed to doing a tiny bit more work to make applications run.

IMHO <- that’s cool!

Thomas, you sound very wise! As you said, there seems to be a lot of confusing information within this thread. As for your last statement… it seems to be a real solution based on experience, but the problem now is… that I am not able to follow you practically!

But theoretically, I think I got it then…
Thanx a lot!

Another thing:
There are a few things that really won’t go into my head! For example… when a game is written in OpenGL… and I run it, why is there an option to choose “Direct3D”?!

If you can choose Direct3D, then it’s probably written for both APIs. If you can choose DirectX, then it’s most probably for the rest - sound, input, etc…

but the problem now is… that I am not able to follow you practically!

It’s easy. Get the file http://www.megaone.com/stuntmania/files/Mesa321.zip (~340kb), which is on my web space; I took it from http://www.hawksoft.com/download/ where the link now seems to be broken. Strictly speaking, there should be a copy of the GPL agreement (version 2, I think) in the zip file, but it seems to be missing. Add it again to be strictly legal.

MESA is GPL, so distributing it yourself is entirely legal, provided the GPL text is present and you are willing to provide a means to the sources should anyone ask. If they do, just send them to http://www.mesa3d.org.

Add this file to your downloads page with a note that says ‘3DFX OWNERS: please additionally download this file and place it in the same directory as the program executable’. Job done!

Be warned, though, that MESA only matches the OpenGL 1.2 API and has one very strange bug when rendering on 3dfx: when combining compiled vertex arrays (it supports this extension, which the 3dfx OpenGL does not - which is nice) and glPolygonOffset to render quads, it breaks each quad into two triangles, then draws one triangle facing forward and one facing back. This bug does not occur if you don’t compile your vertex array, or you don’t set a polygon offset, or even just if you swap your vertex ordering slightly and draw a triangle strip rather than a quad.
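A defensive version of that first workaround is to simply skip the compiled-vertex-array lock when the renderer looks like Mesa, so the bad combination never happens. This is only a sketch: the “Mesa” substring test against GL_RENDERER is an assumption, so check what your Mesa build actually reports:

```cpp
// Sketch: avoid the CVA + glPolygonOffset bug by not locking arrays
// when running on a Mesa renderer.
#include <windows.h>
#include <GL/gl.h>
#include <string.h>

typedef void (APIENTRY *PFNGLLOCKARRAYSEXT)(GLint first, GLsizei count);
typedef void (APIENTRY *PFNGLUNLOCKARRAYSEXT)(void);

PFNGLLOCKARRAYSEXT   glLockArraysEXT   = NULL; // fetch via wglGetProcAddress
PFNGLUNLOCKARRAYSEXT glUnlockArraysEXT = NULL;

static bool UseCompiledArrays(void)
{
    // Requires a current GL context. "Mesa" match is an assumption.
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    if (renderer && strstr(renderer, "Mesa"))
        return false;              // skip the lock, avoid the quad-split bug
    return glLockArraysEXT != NULL;
}

void DrawStaticGeometry(GLint first, GLsizei count)
{
    bool locked = UseCompiledArrays();
    if (locked) glLockArraysEXT(first, count);

    // ... glDrawElements calls sharing the same vertex arrays ...

    if (locked) glUnlockArraysEXT();
}
```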

I intend to put some effort into getting Mesa 4.0.3 to build with 3dfx support, which matches the 1.3 API, but I have not done so yet.