OpenGL

All OpenGL needs is a better implementation, or a unified implementation, letting the IHVs develop only the drivers, as with DirectX and on the Mac. Other than that, it's the best.

People who want to deprecate glBegin/glEnd and other features such as selection/feedback and display lists are likely in favor of D3D and want to eliminate these decent features, which are easy to implement on top of VBOs and modern hardware, and which are the heart of OpenGL's power.
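To back up the "easy to implement on top of VBOs" claim, here is a minimal sketch. The `im*` names are invented for this example (only the `gl*` calls are real GL 1.5/2.1 API), it handles positions only, and it assumes an extension loader such as GLEW for the buffer entry points:

```cpp
#include <GL/gl.h>
#include <vector>

static std::vector<GLfloat> imData; // accumulated positions
static GLenum imMode = 0;           // primitive type from imBegin
static GLuint imVbo  = 0;           // lazily created backing buffer

void imBegin(GLenum mode) { imMode = mode; imData.clear(); }

void imVertex3f(GLfloat x, GLfloat y, GLfloat z)
{
    imData.push_back(x); imData.push_back(y); imData.push_back(z);
}

void imEnd()
{
    if (imData.empty()) return;
    if (imVbo == 0) glGenBuffers(1, &imVbo);
    glBindBuffer(GL_ARRAY_BUFFER, imVbo);
    // Refill per batch; a real implementation would reuse or orphan storage.
    glBufferData(GL_ARRAY_BUFFER, imData.size() * sizeof(GLfloat),
                 &imData[0], GL_STREAM_DRAW);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, 0);
    glDrawArrays(imMode, 0, GLsizei(imData.size() / 3));
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```

A full emulation would also batch colors, normals, and texcoords, but the principle is the same.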

GLSL is about as close to perfect as it gets and makes much more sense than HLSL. In HLSL I cannot declare varying parameters the way I define my custom parameters in GLSL…
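To make the comparison concrete, a sketch assuming GLSL 1.20 (shown as a C string ready for glShaderSource); the variable name is my own choice:

```cpp
// In GLSL a varying is declared under any user-chosen name, no semantic needed.
const char* vertexShaderSource =
    "varying vec3 worldNormal;   // custom name, nothing like TEXCOORD0\n"
    "void main() {\n"
    "    worldNormal = gl_NormalMatrix * gl_Normal;\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;\n"
    "}\n";
// In SM3-era HLSL the equivalent output must be bound to a fixed semantic,
// e.g.:  float3 worldNormal : TEXCOORD0;
```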

Apart from that, D3D is going through a lot of rapid changes, which affects the stability of D3D as a platform. D3D9 is good, but upgrading to 10 or 11, which demand a certain class of hardware and only one version of an unsuccessful OS… gimme a break.

Game developers… please, when you switch to OpenGL, the IHVs whose GL drivers suck will put more effort into improving their drivers and making them bug-free; otherwise the customer will see only a deficient graphics card, not a deficient driver. :)

Have fun!

when you switch to OpenGL

No game developer is switching to OpenGL. Those who have some particular need to use GL (Blizzard for Macs, Id for Linux) will do so. Everyone else uses D3D and will continue to do so.

I’m switching to OpenRaincoat - the API that invites everyone to have a peek.

No game developer is switching to OpenGL

How do you know?

Everyone else uses D3D and will continue to do so.

Why do you insist?

How do you know?

Do you have any evidence that GL 3 meant anything to non-Id/Blizzard game developers?

Why do you insist?

Because it’s true.

Do you have any evidence that GL 3 meant anything to non-Id/Blizzard game developers?

No.

But even GL 2.1 is competitive with D3D9 on a feature basis, regardless of implementation problems.

Do you have any evidence that GL 3 meant anything to non-Id/Blizzard game developers?

Is there any chance that GL will mean anything to game developers?

Is there any chance that GL will mean anything to game developers?

It could have. If LP had been delivered as promised, on time, and with good quality implementations. It would have offered a reasonable alternative to the DX9/DX10 split. But every month that goes by sees the adoption rate of Vista going up, and therefore the use rate for XP going down. Every month sees more and more reasons for game developers to opt for a 2-path DX9/DX10 solution.

But of course, the ARB royally screwed that up.

For current and future projects, from games to CAD, what would be the ideal, wiser solution, assuming the only target platform is some version of Windows?

If we go D3D, then two rendering paths are needed, one each for Vista and XP using D3D10 and D3D9 respectively, which is a burden.

The second option is to go GL, which works on both Vista and XP, hoping the driver will not fail at some point (which is very unlikely); and if it does, the fault lies in the vendor's implementation, not the GL specification itself.

Almost everything in D3D10 is covered by GL 2.1 plus extensions; we don't even need GL 3.
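To make that concrete, a minimal sketch of reaching D3D10-class features from a GL 2.1 context by checking the extension string; the extension names are real, the helper function is illustrative:

```cpp
#include <GL/gl.h>
#include <cstring>

bool hasExtension(const char* name)
{
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    // Naive substring test; a robust version would match whole tokens.
    return all != 0 && std::strstr(all, name) != 0;
}

// For example, two D3D10-class features exposed on GL 2.1 drivers:
//   hasExtension("GL_EXT_geometry_shader4")  // geometry shaders
//   hasExtension("GL_EXT_texture_array")     // texture arrays
```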

In the future, you can use DX10 Level 9 rendering to access DX9 hw through the DX10 API.

DX11 will offer the same, and will be supported on Windows 7 and Vista.
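Roughly, that feature-level mechanism looks like this in the D3D11 API (a sketch; error handling omitted):

```cpp
#include <d3d11.h>

ID3D11Device*        device  = 0;
ID3D11DeviceContext* context = 0;
D3D_FEATURE_LEVEL    picked;

const D3D_FEATURE_LEVEL levels[] = {
    D3D_FEATURE_LEVEL_11_0,  // DX11-class hardware
    D3D_FEATURE_LEVEL_10_0,  // DX10-class hardware
    D3D_FEATURE_LEVEL_9_1,   // "10 Level 9": DX9-class hardware, same API
};

// Walks down the list and creates a device at the best supported level.
HRESULT hr = D3D11CreateDevice(
    0, D3D_DRIVER_TYPE_HARDWARE, 0, 0,
    levels, 3, D3D11_SDK_VERSION,
    &device, &picked, &context);
// 'picked' reports which hardware class was actually obtained.
```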

Currently, I would pick D3D9 if I had to support Windows XP.

It isn't such a big problem to use both APIs in one program; a thin abstraction layer is enough (see the sketch below). But XP support will be dropped one day, and for Direct3D 9 hardware we will get 10 Level 9 in Direct3D 11. So we are back to one Direct3D API for all hardware.
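Something like this hypothetical interface (all names invented for the sketch) is enough to carry both back ends in one executable:

```cpp
struct Renderer {
    virtual ~Renderer() {}
    virtual bool init(void* window) = 0;
    virtual void drawFrame() = 0;
};

struct D3D9Renderer : Renderer {
    virtual bool init(void*) { /* create IDirect3DDevice9 here */ return true; }
    virtual void drawFrame() { /* issue D3D9 draw calls */ }
};

struct GLRenderer : Renderer {
    virtual bool init(void*) { /* create a WGL context here */ return true; }
    virtual void drawFrame() { /* issue GL draw calls */ }
};

// Picked once at startup, e.g. from a config option or the OS version:
Renderer* createRenderer(bool preferD3D)
{
    if (preferD3D) return new D3D9Renderer();
    return new GLRenderer();
}
```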

For most customers it would be your “bad” game that doesn't render correctly. They don't care about bad drivers or whatever; it is your software, and therefore it's your fault. So most developers choose the path that reduces the risk of being blamed for others' mistakes.

OpenGL is a pain in the butt, and today I wouldn't recommend it to anyone anymore. The only reason to use it is that other widely used OS (at universities, and maybe for some CAD). Oh, and maybe the other other OS, but I don't see that one used for anything that needs fancy hardware-accelerated 3D graphics (though that might just be my ignorant point of view).

Jan.

I see now. OpenGL is a big pain in the butt. And I hear this from the GL community itself.

Ok then. I give up and switch to D3D.

I prefer GL, and don't find it too hard to tackle (though I may be toughened by years of painful development for extremely buggy mobile OSes and 6+ code paths for each game/app).
The way I wrap GL looks like DX9/DX10, though. And I prefer semantics in shader code to the pure GLSL style (uploading all uniforms in one call, and allowing for precompiled shaders on nV hardware).
I simply like the dual FIFO.
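For example, the uniform batching I mean looks like this (a sketch; the `u_params` name and its packing layout are invented for the example, and a loader for the GL 2.0 entry points is assumed):

```cpp
#include <GL/gl.h>

// The GLSL side declares:  uniform vec4 u_params[3];
// and the app updates all of it in one call.
void uploadParams(GLuint program, const GLfloat packed[12])
{
    GLint loc = glGetUniformLocation(program, "u_params");
    glUniform4fv(loc, 3, packed);  // one upload instead of many glUniform*s
}
```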

The current Valve Steam Hardware Survey still reports results very similar to a year ago: 80% of users are still on XP, and 20% of GPUs are DX10-capable (across both XP and Vista users). That covers over a million samples, so I'd say broad Vista adoption is still quite a ways off, unless anyone else has good numbers that say otherwise.

As for game developers: I prefer GL for my personal work (indie development), and at work (I'm an AAA game developer doing cross PC/360/PS3 development) chances are that if GL3 gets good driver support (from ATI) and some ability to cache compiled shaders locally on the machine, it is a serious contender to dethrone a DX-only interface. There are some massive advantages of GL3 over DX9, and with broad Vista adoption still a long way off, GL3 has the opportunity to do great things in getting closer to what we have access to on the consoles (sampling depth, better AA support for MRTs, batching of uniforms, access to DX10-like functionality that is available on the 360, etc.)…

But isn't D3D approaching GL more and more with every release? Take the context idea, for example.

What’s your point?

Almost everything in D3D10 is covered by GL 2.1 plus extensions; we don't even need GL 3.

I found it entertaining that the extensions released when GL3 was released were all written against 2.1.

The only reason to move to GL3 is if you are required to program to a standard, meaning you can't get away with coding to NV-only extensions. However, that still assumes ATI will fully support GL3 (the drivers still aren't out yet). But even those drivers will be lacking if they don't also include the extra extensions for the D3D10 features not rolled into GL3.

It’s a mess.

Deprecation by itself does nothing except aid future compatibility. But there's nothing wrong with programming against deprecated features. That's a given; otherwise there wouldn't have been industry outcry (hidden from the public in the secret ARB cave) against breaking compatibility.

Profiles also strive for various types of compatibility. The most useful profile I could imagine would be the debug profile. But even so, we have yet to see what they might do with this.

Profiles and deprecation don’t help a developer overcome API limitations or deficits. And the API itself hasn’t evolved much in the last decade. (I’m claiming that function additions aren’t the API evolving but are added bloat begging for some re-architecting.)
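For context, deprecation and profiles surface concretely only at context creation. A sketch using WGL_ARB_create_context (the WGL_* tokens and the PFN typedef come from the Khronos wglext.h, and the function pointer must be fetched from a dummy legacy context first):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"

// Fetched once via wglGetProcAddress after creating a legacy context:
PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB = 0;

HGLRC createGL3Context(HDC hdc)
{
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        // Forward-compatible: deprecated features are removed.
        // Leave the flag out and the deprecated features keep working.
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0
    };
    return wglCreateContextAttribsARB(hdc, 0, attribs);
}
// A debug context would set WGL_CONTEXT_DEBUG_BIT_ARB instead.
```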

the drivers still aren’t out yet

Neither are nVidia’s; they’re still in beta.

I’m claiming that function additions aren’t the API evolving but are added bloat begging for some re-architecting.

To be fair, adding functions doesn’t create bloat; it’s the lack of simultaneously removing the old functions that creates bloat. And rearchitecting is what you need when old code forces you into bad decisions.

The most useful profile I could imagine would be a performance profile that prohibits software fallbacks and only includes the fast path.

Back in August they said they would have full GL3 support, including GS and instancing, in Q1 2009.

I am very happy with many of the new features in GL3 and will be making good use of VBO streaming and integer support.
The missing scheduling and synchronization support looks like it is going to be included as part of OpenCL.
The shared CL/GL buffers will mean we can do physics calculations on the GPU and then render the buffers directly, without having to copy the data to the CPU and back to the GPU as is the case now.
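A sketch of that sharing path, using real OpenCL 1.0 entry points; the context must be created with GL sharing enabled, the kernel setup is omitted, and the helper name and its parameters are invented for the example:

```cpp
#include <CL/cl.h>
#include <CL/cl_gl.h>
#include <GL/gl.h>

void runPhysicsOnVbo(cl_context clCtx, cl_command_queue queue,
                     cl_kernel physicsKernel, GLuint vbo, size_t vertexCount)
{
    cl_int err;
    // Wrap the existing GL vertex buffer as a CL memory object: no CPU copy.
    cl_mem buf = clCreateFromGLBuffer(clCtx, CL_MEM_READ_WRITE, vbo, &err);

    glFinish();                                          // GL done with the VBO
    clEnqueueAcquireGLObjects(queue, 1, &buf, 0, 0, 0);  // hand it to CL

    clSetKernelArg(physicsKernel, 0, sizeof(buf), &buf);
    clEnqueueNDRangeKernel(queue, physicsKernel, 1, 0, &vertexCount, 0, 0, 0, 0);

    clEnqueueReleaseGLObjects(queue, 1, &buf, 0, 0, 0);  // hand it back to GL
    clFinish(queue);                                     // then render the VBO
    clReleaseMemObject(buf);
}
```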

The main things everyone is waiting for now are Immutable State Objects and decoupling textures from filtering.
Hopefully these (or something similar) will be included in 3.1, which is due around Feb. next year, and OpenGL can take its rightful place as the world's best graphics API.
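A hypothetical sketch of what decoupled filtering could look like; these `gl*Sampler*` calls are not in the GL 3.0 spec, the shape is just my guess:

```cpp
// One texture sampled with two different filter states, with no
// glTexParameter mutation in between.
void drawWithBothFilters(GLuint texture,
                         GLuint linearSampler, GLuint nearestSampler)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    glBindSampler(0, linearSampler);    // unit 0 samples with linear filtering
    // ... draw ...

    glBindSampler(0, nearestSampler);   // same texture, nearest filtering
    // ... draw again ...
}
```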

If they also add cubemap arrays, individual render-target blend modes, Scatter/Gather4, compiled-shader caching, improved multithreading support, and especially tessellation, then I will be absolutely ecstatic.

The only thing really worrying me at the moment is that we still haven't had a new issue of Pipeline, even though they said it would be returning to a regular publishing schedule.
They have obviously been very busy with OpenCL, but once that is wrapped up, hopefully we will get some 3.1 news.
In the meantime, a Pipeline issue about OpenCL and how it interacts with OpenGL would be most appreciated.

Hopefully these (or something similar) will be included in 3.1, which is due around Feb. next year, and OpenGL can take its rightful place as the world's best graphics API.

Except for all of the other pain points (the inability to separate vertex and fragment shaders, the VAO string thing, any of the other things that have been discussed in several threads on this forum, etc.) that better APIs don't have.
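The first of those in concrete form (real GL 2.0 calls); the pain is the forced relink per combination:

```cpp
// GL links one monolithic program per vertex/fragment pair, so N vertex
// shaders combined with M fragment shaders cost N*M link steps.
// D3D, by contrast, sets the two stages independently.
GLuint makeProgram(GLuint vs, GLuint fs)
{
    GLuint p = glCreateProgram();
    glAttachShader(p, vs);
    glAttachShader(p, fs);
    glLinkProgram(p);   // repeated for every combination you ever use
    return p;
}
```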

They have obviously been very busy with OpenCL

No, they haven't. OpenGL doesn't interact with OpenCL; it is OpenCL that provides the interop API, so the OpenGL ARB hasn't been involved in OpenCL.