OpenGL vs Direct3D



skater_g
03-13-2001, 12:18 PM
I'm writing a persuasive paper for HS and I'm wondering what experiences people have had with OpenGL and Direct3D. Which do you personally prefer and why? I only use OpenGL. Are there any drawbacks that you have found in your preferred API? Is there a different preference when using different languages, such as Visual Basic or C? Are there any areas of improvement that you think will need to be addressed in future versions? Please tell me whatever you are willing to share. I really would value your input. Thanks.

Bryan

john
03-13-2001, 02:30 PM
Hi! I'm wondering how to fuel a bush-fire. see, i've got several hundred drums of petrol, and this works, *BUT* i can only turn several hundred hectares of scrub into a smouldering mess. I was wondering... has anyone had much success with nuclear weapons?

cheers,
John

[This message has been edited by john (edited 03-13-2001).]

Gorg
03-13-2001, 03:21 PM
Here we go again!!

Deiussum
03-14-2001, 10:09 AM
John, I think your nuclear weapon idea sounds like a good one. Not only do you get a lot of initial damage, but also the benefits of the whole mushroom cloud and nuclear fallout thing.

Skater_g, do you seriously think you're going to get objective opinions on OpenGL vs. Direct3D on a board that is primarily there to discuss OpenGL? :) Post the same thread on a Direct3D board and you'll get opinions that are completely different.

As far as differences between the two go, OpenGL is easier to learn for many people, but a lot of people claim D3D is easier...

OpenGL has the whole extension thing going for it, while D3D gets a new update every year (and in some cases those changes require a drastically different approach to the way you program).

skater_g
03-15-2001, 05:39 AM
Deiussum,

Thank you for your response. It is all I need for the 4th source for my paper. Hehe....I knew it wouldn't be the best idea to post the message on an OpenGL board, but since I'm writing my paper to persuade people to use OpenGL, I figured this would be a good place to get some info......some serious info.

In case anyone is wondering, I cannot stand to use D3D. I dislike it with a passion, and I refuse to code with D3D anymore. Sorry though, I shouldn't have even brought it up.....

john
03-15-2001, 03:30 PM
but, if you dislike something, then you presumably know WHY you dislike it. i don't like... oh, say, COBOL for very well defined reasons: the syntax structure is brain DEAD, the semantics are screwed in the head, and any compiler on a unix system that is called "rmc" is just screaming for trouble. ("rm" being the ReMove command... which is nice, but not when some of us press the space bar too early =)

but... i know why i don't like cobol, and i can cite reasons why it IMHO sucks... so i don't need to ask ppl why it sucks when i already know why!! so, if you've used D3D and don't like it (which you say you don't...) then... why ask ppl for their opinions when you can form one yourself? "i think opengl is better than d3d for reasons X, Y and Z, with the following supporting evidence"...

feh!

cheers,
John

andreiga
03-18-2001, 04:40 PM
With OpenGL you can have per-pixel diffuse+specular lighting on Geforce2 family chipsets (GTS, MX etc.) through the register combiners extension, like in Doom3. I said like in Doom3, but this doesn't mean that Doom3 will run on a Geforce2. Indeed, the Geforce3 is more powerful (it adds more combiners and constants, plus a new extension named texture shader), but this doesn't mean that a Geforce3 is a must (for now).
Unlike OGL, DX8 wants only the new advanced NSR from the Geforce3 (which is STUPID) and makes the Geforce2 a Geforce1 with more fillrate.
In conclusion, in the future you may see games with per-pixel lighting running only on OGL (because of the number of Geforce2 family chipsets on the market).
One other important thing is the support of the fence/VAR extension, which is MUCH better implemented than the DX8 vertex buffers, so you can have many more triangles rendered per frame (believe me, I'm first a DX programmer and second an OGL programmer).

P.S. The vertex programming extension (or DX8 vertex shader) is a plus of the Geforce3, but I've recently seen some benchmarks made with 3DMark 2001, and the emulation (which I think is done completely on the CPU, but I'm not so sure about that) is almost as fast as on the Geforce3 (this extension is required as a setup for per-pixel lighting).

LordKronos
03-18-2001, 05:05 PM
Originally posted by andreiga:
With OpenGL you can have per-pixel diffuse+specular lighting on Geforce2 family chipsets (GTS, MX etc.) through the register combiners extension

Well, the drivers are supposed to have a way that allows D3D8 apps to get access to the register combiners. Can't say for sure which functionality is available this way, but it's something you should be made aware of.



Originally posted by andreiga:
Unlike OGL, DX8 wants only the new advanced NSR from the Geforce3 (which is STUPID) and makes the Geforce2 a Geforce1 with more fillrate.

Just so you know, the Geforce2 IS a Geforce1 with more fillrate. While there are architectural differences between the cards, there is no difference in functionality. Both cards support the exact same features. From a user/developer standpoint, the only difference is speed.

Originally posted by andreiga:
One other important thing is the support of the fence/VAR extension, which is MUCH better implemented than the DX8 vertex buffers

I can't argue this for a fact, but if you use vertex buffers correctly (discard contents & no overwrite), I believe the D3D driver should be able to make use of the fences to get optimal results. Maybe Matt can confirm/refute this, but I wouldn't be surprised if he never even comes in here (given the title of the thread).

mcraighead
03-18-2001, 09:25 PM
I don't know about DX8, but on DX7, VAR was definitely superior. I think DX8 fixes some but not all of the DX7 vertex buffer problems.

- Matt

andreiga
03-19-2001, 01:13 PM
Originally posted by LordKronos:
Just so you know, the Geforce2 IS a Geforce1 with more fillrate. While there are architectural differences between the cards, there is no difference in functionality. Both cards support the exact same features. From a user/developer standpoint, the only difference is speed.

Maybe what you say is right (since I have a GTS and I never had a Geforce1 to see if the register combiners work), but don't forget that the NSR is a new component in the Geforce2.
One more thing: in the Nvidia OpenGL SDK they show the power of register combiners and how to use them, but ONLY on the Geforce2 and Geforce3
(the latter with an enhanced version of this extension).

Lars
03-19-2001, 04:04 PM
And you can do per-pixel lighting on the Geforce1 and up under Direct3D by using the dot product operation for the texture stage state... I don't know the syntax at the moment, but it works. You haven't got the full freedom that the combiners give you, of course.
Or wasn't there something??? I remember a hack for the TNT cards, where you could use their combiners by setting specific stages to specific states in the D3D pipeline.
Maybe you can do this with the GeForce too :-)

Lars

LordKronos
03-19-2001, 04:11 PM
Originally posted by andreiga:
[B]but don't forget that the NSR is a new component in Geforce2.[/B]

What features does the NSR provide? I'll tell you... NONE. It's a marketing term. Everything (feature-wise) in the NSR was available on the GeForce 256. Now, certainly the 256 didn't have 4 pixel pipelines (which is part of the "NSR" thing), but all the additional pipelines equate to is performance. Everything a Geforce2 can do (or at least everything nvidia has disclosed thus far) can be done on a GeForce 256.

andreiga
03-19-2001, 05:07 PM
Originally posted by Lars:
And you can do per-pixel lighting on the Geforce1 and up under Direct3D by using the dot product operation for the texture stage state... I don't know the syntax at the moment, but it works. You haven't got the full freedom that the combiners give you, of course.
Or wasn't there something??? I remember a hack for the TNT cards, where you could use their combiners by setting specific stages to specific states in the D3D pipeline.
Maybe you can do this with the GeForce too :-)

Lars

As I said before, I don't know how a Geforce1 works under OGL (maybe it works the same as the second generation), but the point is that under DX everything is set as render states (light vector etc.) and you can do only diffuse per-pixel lighting (you can't set the half-vector or specular power, which are necessary for specular lighting). I have to remember the name of the topic: OGL vs D3D. Of course, if you have a GF3 the same lighting computations can be done under DX8 and OGL, but how many GF3s are on the market?

Gorg
03-19-2001, 07:58 PM
You can use EMBM in Direct3d to make specular highlights if you don't have access to pixel shaders.

kaber0111
03-19-2001, 10:30 PM
>to is performance. Everything a Geforce 2
>can do (or at least everything nvidia has
>disclosed thus far) can be done on a
>GeForce 256.
:)

marketing scam, hehe
microsoft pasting game characters into their screenshots...

now that _IS_ a marketing scam.
pfff.

XBox is a blunder;
yeah, i like the hardware, but personally i think it will be an enormous flop.
ms is going at this the _wrong_ way,
and it's going to cost them.
and i think the street is frowning on their actions as well.

-akbar A.

andreiga
03-20-2001, 03:15 AM
Originally posted by Gorg:
You can use EMBM in Direct3d to make specular highlights if you don't have access to pixel shaders.

Yes, but EMBM is not supported on the GeForce1 and GeForce2, and again, how many GeForce3s are on the market? (I'm not considering the G400 because of the poor performance, and the ATI RADEON because of the very, very poor drivers.)

Roderic (Ingenu)
03-20-2001, 03:26 AM
The KYRO and KYRO II (PowerVR Series3) support EMBM.
They have the best quality/speed/price ratio on the market.

LordKronos
03-20-2001, 03:54 AM
Originally posted by kaber0111:
marketing scam, hehe
microsoft pasting game characters into their screenshots...

now that _IS_ a marketing scam.
pfff.

Marketing scam? Well, calling it a scam is debatable. Certainly the features of the NSR weren't new compared to a GeForce256. However, there was a performance increase, and as someone from nvidia said to me on the topic once (I'm paraphrasing), "no, it's not new, but often the increased performance can make the difference between these types of effects being feasible or not". Then again, isn't most marketing a scam in one form or another?

As for the XBOX screenshot "scam", I would disagree with you, but that's way off topic so I won't bother.

cass
03-20-2001, 04:19 AM
Being in the marketing dept at NVIDIA, I would say it's about "spin" and timing. There was a right time to push hard for NV_register_combiners as a "branded" feature -- and that time was GeForce2 launch. NSR was a much more suitable name for marketing.

NV_register_combiners was pushed to developers from the GeForce256 on, because for consumers to enjoy a feature like NSR, developers have to program to it.

Thanks -
Cass

ET3D
03-25-2001, 11:47 AM
I'm programming OpenGL now, but I'm following D3D, and might start programming it at some point. At least in terms of simplicity, I think that D3D is getting better with every version. I like the easy loading of textures (including video textures), and the idea of Effects. I'm glad that when I finally get into it, it'll be considerably easier than when I started following it (which was with DX6).

Nutty
03-25-2001, 12:20 PM
bah.. don't you mean "considerably gayer"??? :)

I really can't understand the fuss about D3D. I can't find one single reason to use it over OpenGL. If someone can give me some good reasons to learn it.. I might just do that..

M$ have already buggered up the OS market with their flimsyware/bloatware crap OS's.. and even worse products.. (though in their defence I do like Visual C++) And it seems they want to dominate the 3D market with their lame plagiarised API.

I'll be smiling ear to ear the day M$ go bankrupt... :)



Nutty.

jwatte
03-25-2001, 04:43 PM
any compiler on a unix system that is called "rmc" is just screaming for trouble

Do what I do, and just rename it "cc" for "cobol compiler" :-) :-)

kaber0111
03-25-2001, 08:26 PM
>I really can't understand the fuss about
>D3D? I can't find one single reason to use
>it over OpenGL. If someone can give me some
>good reasons to learn it.. I might just do
>that..

Umm, I know this is an OpenGL forum...
but there's something called fragmentation.

Like, ATI has their own extension to do dot3 lighting in OpenGL,
whereas nvidia has their own way to do it
as well..

but in D3D, it's unified and there is only 1 way/interface to do it...

CViper
03-25-2001, 10:46 PM
Well, I've been "learning" D3D out of curiosity lately... well, by mistake, D3D RM (retained mode or something)... First you have to battle your way through about 10,000 COM objects, and then you still can't do what you really want to (you can load files directly, but specifying vertices directly, well, forget it). I guess D3D IM (immediate mode) is somewhat better in that way (I don't really know though).

And a little note on the xbox stuff: they froze Halo (the Bungie game) until they release the xbox... just for that I hate M$... and the fact that they support D3D more than OGL just makes it worse.

Nutty
03-26-2001, 01:32 AM
It's nothing more than trivial to write a wrapper for several extensions with the same functionality.. that's basically all D3D does...

Though with OpenGL, you might get access to more features (a la pixel shaders), as you're not limited to D3D's implementation.

Nutty

zed
03-26-2001, 02:56 AM
>>Like, ATI has their own extension to do dot3 lighting in OpenGL,
whereas nvidia has their own way to do it
as well..<<

i just noticed a few more extension specs have been posted including http://oss.sgi.com/projects/ogl-sample/registry/ARB/texture_env_dot3.txt

(wtf is there a clear fields button for, he saiz after pushing that instead of send)

kieranatwork
03-26-2001, 11:32 AM
I don't know about in America, but here in England, seeing Bill Gates announce the XBox in his knitted sweater, accompanied by some anonymous WWF wrestler, did nothing for its potential appeal to 18-30 year olds... he doesn't seem to realise that the average console buyer does not know or care what pixel/vertex shaders are, or that its fill rate is double/treble that of the PS2 or whatever... they're interested in innovative, exciting and most probably Japanese games.
They'll continue to buy PS2s, and the XBox will go the same way as the MSX... straight into the classified ads in local newspapers. Shame... but I hope to god that NVidia have not invested too much in this doomed project - I think they deserve better...

Nutty
03-27-2001, 12:19 AM
Not sure I agree with that. Being in the games industry myself, I reckon it's gonna totally stomp all over the PS2. It's basically a fixed-spec, highly optimised PC... and there are loads of developers out there that would love to get their hands on this thing..

The PS2 ain't really that hot.. it's really useless at textures.. only 4 meg of VRAM, and no hardware texture compression.

Another good thing about the XBox is that it's really easy to develop for.. basically just get a geforce 3 PC with DX8, and you're 90% there.. very easy to port existing PC games too.. especially if they were already written using D3D..

Nutty

zed
03-27-2001, 12:10 PM
i've written a paper entitled "why the xbox won't succeed".
as nutty saiz, it's gonna be a hit with developers; unfortunately, the developers DON'T buy the games.
kieranatwork is far closer to the money

maxuser
03-27-2001, 01:36 PM
Few people despise M$ more than I do, but in the PC software industry, they're the 800-pound gorilla that can bully everyone else, and they usually get what they want. Why else could an inferior API like the early versions of D3D survive? M$'s often successful strategy is to put out an inferior first version (like Windows, IE, D3D, etc.), let the industry insiders laugh at it, keep chipping away at market share with iteratively better versions, until they completely dominate the market with a product that is "good enough." I doubt XBox will be any different.

maxuser
03-27-2001, 01:58 PM
As far as the original topic (D3D vs GL) goes, it's a matter of personal preference and intended use. If you like pure, clean, "academic" APIs, OpenGL is for you. (It's clearly not just academic, as John Carmack has proven.) If you don't mind, or even enjoy, getting your hands dirty with often needless complexity, and you care only about supporting the latest 3D features on Windows, then D3D is worth a look. I personally like to experiment with somewhat academic 3D stuff on Mac OS X, so OpenGL is the obvious way to go. BTW, OpenGL support on Mac OS X is far superior to the second-class support that M$ provides (you can write an OS X OpenGL-based screensaver in about 50 lines of extremely simple code; try that on Windows), but that's another topic...

Nutty
03-27-2001, 11:39 PM
Yeah.. Mac OpenGL seems to be getting even more support. Didn't I read somewhere that Apple have dropped their own API to push OpenGL more on the Macs?

If only M$ would do that too... :)

Don't you think it would be better if MS dropped D3D and pushed for better OpenGL support under Windows? Then all major platforms and OS's would have a common top-of-the-range 3D API.. it would rock.. but no.. instead they have to be stubborn gits, and force their ****e on us as always.

Nutty
03-28-2001, 01:05 AM
oooops... having trouble accessing the forum boards.. laggy.. and unresponsive. Hence the double post.. won't let me delete it though..

odd.

maxuser
03-28-2001, 05:00 PM
Apple realized that they didn't have the market share or momentum to push their own 3D API (QuickDraw 3D), so they wisely adopted OpenGL for Mac OS 9, and inherited it from NeXTStep/OpenStep for Mac OS X. (BTW, there's a Q3D-compatible API implemented with GL called Quesa, for those interested in yet another retained-mode layer over GL.) M$, on the other hand, *does* have the market share and momentum to successfully push their own API. Smaller platforms (like Mac and Linux) can only survive by adopting standards, whereas larger platforms (M$) will survive by driving a stake into the heart of any standard that would level the playing field with the small guys. They've done it with client-side Java, and I sure as hell hope they don't do it with GL. I don't like it, but that's how business works.

andreiga
03-30-2001, 02:12 AM
M$ hates OpenGL because they can't control it (there's no company to buy in order to own OGL). Besides that, they hate it even more because the OGL specifications are not made by marketing guys (which would be very wrong).

P.S. I'm a marketing guy at a software company.

Hull
03-30-2001, 01:09 PM
And that's why we love it :P

I personally dislike DX because of its platform dependency and the M$ evil plans
to 'take over the world' behind it :rolleyes:

I used to think the Mac was a waste of money, but with their frenetic support of OpenGL and impressively good judgement, I have been starting to think about buying one and starting to develop on it.
(money issue only here.)

I think JC had something to do with it too ;)

Nutty
03-31-2001, 01:57 PM
Been talking with some of my mates from work.. and we still can't see why the Xbox might fail. Maybe you're under the impression that it will cost loads, seeing as the graphics behind it are gonna be a superset of the Geforce 3. It won't. Apparently it will take M$ 5 years to get into profit, due to the loss they will make selling the machine at such a low cost.

If I saw a console out there with a better-than-Geforce-3-spec'd graphics system in it.. for say 200 quid.. BARGAIN! I'd jump at the chance to get one.

I still reckon the Xbox will totally stomp over the PS2... probably the gamecube too.. but that looks like a much nicer system than the ps2 as well... I really think that sony have misjudged with the ps2...

just my tuppence worth.

Nutty