Version War!

Is OpenGL 3 a real threat to Direct3D?

Why did MS jump from D3D10 to 11 that fast?

And if CAD developers are happy with legacy OpenGL calls, then why are they jumping to D3D? Not all…just one developer I believe.

Why was nobody upset about D3D10? Its API is not really friendly compared to D3D9, and it does not offer more than geometry shaders, which are supported by extensions on OpenGL.

I use both APIs, D3D9 and OpenGL, and both are great… so what’s the point of all the flame war in response to GL 3?

Imho:

GL3 is no threat to D3D. GL fans moved to DX10; only one DX fan went to OpenGL (me).
It’s only DX10 that got delayed; otherwise, (bi)annual version increments were the norm for DX. So MS is not rushing with 11.

CAD… you can’t easily tell your customer he screwed up by buying a bunch of expensive ATi cards, can you?
DX10 is easily ignorable, DX9 always works, OpenGL works only on nV.

[For the umpteenth time,] GL3 was supposed to be a fresh restart, one that ATi would finally support, and where developers could safely ignore all the wrong legacy docs/tutes. It had to be clean, tiny, fast, explicit. It had to present only one path - the fast path. It had to remove bind-to-edit and things like glActiveTexture. It had to support cached shaders. In reality, it does none of the above: it just moved some exts to core, drove GLSL away from caching, and removed vital functions. At least it laid the groundwork for improvement and some future specs. But nothing we can use this decade. Nothing’s changed. Perhaps ATi were moved a little, and will support… the few core additions.
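To make “bind-to-edit” concrete, here’s a rough sketch (the texture object and filter setting are just placeholders) of the pattern GL3 was supposed to kill, next to what a direct, object-named call looks like where a driver exposes something like EXT_direct_state_access:

```cpp
// Bind-to-edit: to change a texture's state you must first bind it,
// clobbering whatever was bound to the active unit before.
GLuint tex = 0;                        // placeholder texture object
glGenTextures(1, &tex);
glActiveTexture(GL_TEXTURE0);          // select the unit being edited
glBindTexture(GL_TEXTURE_2D, tex);     // bind just to edit state...
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
// ...and the previous binding on unit 0 is gone as a side effect.

// The "direct" style people hoped for (shown with the
// EXT_direct_state_access flavour, where the driver exposes it):
// the object is named explicitly and nothing gets bound.
glTextureParameteriEXT(tex, GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
```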
The stagnation that Vista and ATi represent is suffocating for most, I guess.

Just my uneducated guesses, kind of.

Why did MS jump from D3D10 to 11 that fast?

What’s fast about it? It’s been a good two years since D3D 10, and D3D 11 isn’t a big API change.

drove GLSL away from caching

How did it do that? It didn’t discuss caching, and neither did Longs Peak (Reloaded might have had caching, but there was never any guarantee of that).
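For what it’s worth, the kind of shader caching people wanted only showed up much later, as ARB_get_program_binary; a rough sketch of the idea, assuming `program` is an already-linked GLSL program object and ignoring file I/O and error handling:

```cpp
// Save a linked program's driver-specific binary so a later run can
// reload it instead of recompiling the GLSL source.
GLint length = 0;
glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);

std::vector<unsigned char> blob(length);
GLenum format = 0;
glGetProgramBinary(program, length, nullptr, &format, blob.data());
// ... write format and blob to disk ...

// Next run: try to restore it, falling back to a source compile if the
// driver rejects the blob (it is allowed to do so at any time).
glProgramBinary(program, format, blob.data(), static_cast<GLsizei>(blob.size()));
```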

[quote=glfreak]Is OpenGL 3 a real threat to Direct3D?[/quote]

IMHO, OpenGL 3.0 in its current guise has not made OpenGL particularly more competitive against Direct3D.

Apple and Linux taking market share in the mobile, netbook, notebook, and desktop markets is a threat to Microsoft.

This means the market share of OpenGL ES and OpenGL-based products is going up, and the market potential for Direct3D products, as a percentage of the overall market, is going down.

It’s this movement of the wider industry that is the threat to Direct3D, rather than OpenGL doing any better or worse job of competing.

The biggest problem I see with OpenGL getting further out into the market, regardless of version, is the lack of up-to-date and reliable drivers. Intel and ATI not supporting extensions that expose Direct3D 10.0 functionality in OpenGL 2.x is a very serious problem, as is the number of problems that end users encounter.

All the noise about OpenGL 3.0 being a disappointment I find rather overblown. Sure, it’s not perfect, but no API is. My original hope for a lean and mean OpenGL 3.0 was partly about making my coding life easier, but principally down to the hope that perhaps Intel and ATI might have a chance of implementing reliable and feature-complete drivers if the drivers had less to do and less to f**k up.

Perhaps the dwindling MS market share will put pressure on Intel and ATI to put far more effort into the OpenGL family of APIs. I don’t just want good OpenGL 3.x drivers; we absolutely need better OpenGL 2.x drivers too.

Give us better drivers and OpenGL will have a better chance of competing with Direct3D in desktop gaming. With better OpenGL drivers there really isn’t any need for Direct3D at all on any platform save the Xbox.

Robert.

Second.

Solid drivers and timely, ubiquitous feature level support - it’s a no-brainer for the desktop.

I see the problem of OpenGL from a different perspective. It’s merely the implementations that make GL look bad and less appealing to developers. D3D’s reliability and stability come from it being driver-based: IHVs just write drivers containing the minimal code required by the internal workings of D3D. GL, on the other hand, makes it much more difficult for IHVs, who have to write a full implementation, which is a nightmare… even D3D would be hell if it were done as a full implementation.
Let’s say we get a perfect specification for GL 4.0 or whatever future version, and it’s more robust (rather than complex), and then we get a crappy implementation because the IHV team failed to pull it off - and I don’t blame them personally. The point is that GL should first move to a driver-based rather than implementation-based model. Get me a reliable GL 2.0, or even 1.5, and then let me know.

Isn’t DX9 just that, with its ultra-heavyweight draw calls? If so, then no thank you :).

If you want solid drivers, then it is better to have a simple API. It is not clear what GL3 is supposed to be. Are IHVs going to write a real pure-and-clean GL3 profile that is going to be stable?
Somehow I doubt that integrated chipset makers are going to release new drivers.

Why was nobody upset about D3D10? Its API is not really friendly compared to D3D9, and it does not offer more than geometry shaders, which are supported by extensions on OpenGL.

GS is only available on nVidia. We can’t really say it is part of GL yet.
I didn’t see any developers pissed off by the DX10 API. It is a lean and mean API, just like MS promised them.
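Just to make that concrete: on GL 2.x you have to probe the extension string at runtime, and on non-NV drivers the probe generally fails. A rough sketch, assuming the usual GL headers and a current context:

```cpp
// GL 2.x-era check for geometry shader support via the EXT extension.
const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
bool hasGS = ext && std::strstr(ext, "GL_EXT_geometry_shader4") != nullptr;

if (hasGS) {
    // Only valid when the extension is actually advertised.
    GLuint gs = glCreateShader(GL_GEOMETRY_SHADER_EXT);
    // ... compile, attach and link as with any other shader stage ...
} else {
    // On most ATI/Intel drivers of the day this is the branch taken,
    // so geometry shaders can't be treated as baseline GL functionality.
}
```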

That’s why I view it as a version war. The API itself no longer matters; it’s now evaluated based on what people expect from a version number. So, for instance, if I had been dreaming of a GL 3.0 specification full of glNewTemplate or some such, then I would be totally disappointed by the released spec.

Again, OpenGL is robust, not a simple API made for students… it’s a damn beast of calls that takes your geometry data deep down into the rendering pipeline.

I find the GL 3.0 spec just as perfect as every previous version. Frankly, it’s a real threat to D3D. NV is releasing GL 3.0 drivers now for both Windows and Linux. If ATI, sorry, I mean AMD :), is unable to make a good implementation, then it’s their fault, not GL’s.

Doesn’t look like it …

Are IHVs going to write a real pure-and-clean GL3 profile that is going to be stable?

There’s no such thing. The only GL 3.0 profile is the one described in the spec (only a GL spec can define a profile).

The thing is that GL 3.0 doesn’t fix any of the problems of OpenGL. It basically punts that to GL 3.1, since 3.0 only deprecates features. Of course, it doesn’t deprecate enough to make writing drivers anywhere near as easy as it is for D3D.

Basically GL 3.0, 3.1, and probably 3.2 will not see any of the promised implementation improvements that Longs Peak was purported to deliver. And, considering the ARB’s complete and total unwillingness to actually fix real problems, I wouldn’t hold my breath on it happening any time soon.

The best you can hope for is that Blizzard and Id strong-arm ATi into writing OpenGL drivers that don’t suck. Unfortunately, it’s more likely that ATi will simply write OpenGL drivers that run Blizzard and Id products, and everything else is basically up for grabs.

In NVIDIA we believe :slight_smile:

With respect to what ScottManDeath posted, that’s what I expected to happen. Yes, nvidia can do a lot of things. They have a lot of confidence.

That blurb from the GL3 page is nebulous at best. Seems to me all they’re trying to do is appease the customers that have a huge stake in legacy code, and the new deprecation system offers vendors a way to do just that while keeping an eye forward.

Deprecation is cleverly designed - the single most important thing to come out of GL3, IMHO.
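To illustrate, the mechanism is exercised at context-creation time; a rough sketch on Windows, assuming WGL_ARB_create_context is available and that the entry point was already fetched through a throwaway legacy context (not shown):

```cpp
// Request an OpenGL 3.0 context with deprecated functionality excluded.
// hdc is assumed to be a valid device context.
const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 0,
    WGL_CONTEXT_FLAGS_ARB,         WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    0   // attribute list terminator
};
HGLRC ctx = wglCreateContextAttribsARB(hdc, /*shareContext=*/nullptr, attribs);
// In such a forward-compatible context the deprecated paths (immediate
// mode, the fixed-function matrix stack, ...) are simply not supported,
// which is what lets a vendor keep the legacy path separate going forward.
```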

I don’t think any “cleaner” or simplified GL version with removed features, etc., can make ATI produce good drivers. They simply don’t want to allocate enough resources to their driver team. They don’t see it as a good investment.

If the new GL version has fewer features and requires less effort from the driver team, ATI will just cut the budget even more and the driver quality will remain the same [censored].

I don’t think any “cleaner” or simplified GL version with removed features, etc., can make ATI produce good drivers. They simply don’t want to allocate enough resources to their driver team. They don’t see it as a good investment.

It would help.

A cleaner GL API would allow ATi, or other companies, to develop GL implementations with fewer resources. That is, if the resources ATi currently puts into its GL drivers are insufficient, then making GL drivers easier to write would stretch those limited resources further.

I think that the NVidia support is only due to the CAD market.

They still do not support GLSL in FxComposer, and if you go to their developer website there is very little OpenGL information compared with the amount of DX information they provide.

But they are not alone in this; most of the other IHVs are doing the same nowadays.