Well Gurus, how is OpenGL 2.0 coming along?

To be frank, I’m going to be moving over to DirectX. I don’t want to argue about which is best - but I do feel that the ARB should have cracked this nut a long time ago. I am now in a position where I have to start a new project and, once again, decide whether to use DX or GL.

The issue for me is so simple it's amazing. I don't feel I have the time to write classes for different codepaths (VAO/VAR, plain VAs, etc.); with DX I can just use a vertex buffer and be done with it. Moreover, having to cover these different codepaths surely increases the chances of bugs in my programs, basically because more code = more bugs.
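Just to show what I mean, here's a rough sketch (hypothetical names, and the VAO check is against a spec that's still only a draft) of the path-picking boilerplate GL wants before you can even allocate a buffer - and each returned path then needs its own allocate/fill/draw code on top:

```cpp
// Rough sketch only - real code would also fetch the extension entry
// points via wglGetProcAddress once a path is chosen.
#include <cstring>
#include <GL/gl.h>

enum VertexPath { PATH_PLAIN_VA, PATH_NV_VAR, PATH_ARB_VAO };

static bool hasExtension(const char* name)
{
    // classic (slightly sloppy) substring check against the extension string
    const char* all = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return all && std::strstr(all, name) != 0;
}

VertexPath pickVertexPath()
{
    if (hasExtension("GL_ARB_vertex_array_object"))  // spec still a draft
        return PATH_ARB_VAO;
    if (hasExtension("GL_NV_vertex_array_range"))    // NVIDIA-only fast path
        return PATH_NV_VAR;
    return PATH_PLAIN_VA;  // plain vertex arrays: works everywhere, slowest
}
```

Three implementations of the same thing, where D3D needs one.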

This isn’t the only issue I’m concerned about with GL. Although there are lots of libraries around for doing all sorts of stuff, I think one of the most useful things about the DX SDK is the utility modules (.x file handlers, mesh objects, etc.) - these let me get up and running much quicker than would be the case with OpenGL.

Don’t get me wrong, I loved coding with GL - but I see it more as a convenience for demo coders/SOTA game writers these days. I don’t need bleeding-edge tech for my applications, just a nice comfortable API.

So, finally, will GL 2 be able to give me this? Not soon, for sure. It’s a real shame.

I hope you remembered to wear your flameproof suit…

– Tom

Not wearing one - I don’t need one. The way I see it, if I need to be convinced I should be using OpenGL over D3D after two years’ solid experience with GL (and one successful product launch), then something is wrong. That’s why I’ve posted here; perhaps I’m missing the point?

I don’t need to be cross-platform - that’s the only thing I see as an advantage with GL. I’ve weighed up the pros and cons, used both APIs, and have come down in favour of DX for now.

I am a guru. And I say: OpenGL code is so pleasant to write and read. It is so beautifully designed. GL2 code will be an artform of its own; go read the (preliminary) specs from 3Dlabs.
Those who go with MS & DX will get depressed in the long run just by looking at their own code. Ever compared a Unix interface to a Windows one?

Don’t underestimate the power of aesthetics;
GL2 apps will look better because of this.

Well, that’s true, but apart from some obscure hardware setups, GL2.0 is just a series of lecture slides.

I envy the D3D gurus’ vertex buffers (sigh…)

The state of GL right now is pretty good. Once the ARB_VAO spec is completed, I’ll see no reason to switch. With the release of ARB_vp and ARB_fp, things look promising. I do agree that multiple code paths are annoying. And fritzlang’s point about the way GL code looks is important to me too. I’ve used DX5-7 and have nothing against DX/D3D, but I simply like GL too much to switch. I like the way it looks and the way it works. GL2 is coming in one form or another (and hopefully sooner rather than later).

I am confused. You said you are moving to D3D, and then you said you want to be convinced to stick with OpenGL. You don’t even know what you want…

Hey, it’s quite possible to support both, y’know. It’s a little tricky at first, but once you know both APIs, writing an abstract renderer interface is quite straightforward - you really should be doing this anyway (it saves you rewriting stuff)… You have to track render states, don’t you? You have to do this no matter which API you use, so put a nice layer between you and the rendering API to keep things tidy. Clever use of function pointers will speed things up.
You’ll always have a choice then - you won’t be at the mercy of Microsoft, because you can just switch your dev time over to improving the GL version of your abstracted interface. There’s not all that much to drawing stuff - everything does things much the same way.
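Something along these lines works - just a sketch with made-up names, mind; a real one would also cache render state in the base class so redundant calls never reach either API:

```cpp
// Hypothetical abstract interface; the game code only ever sees these.
class VertexBuffer {
public:
    virtual ~VertexBuffer() {}
    virtual void* lock()   = 0;   // map the buffer memory for writing
    virtual void  unlock() = 0;
};

class Renderer {
public:
    virtual ~Renderer() {}
    virtual VertexBuffer* createVertexBuffer(int bytes)     = 0;
    virtual void setTexture(int stage, unsigned handle)     = 0;
    virtual void drawTriangles(VertexBuffer* vb, int count) = 0;
};

// One concrete subclass per API, each hidden behind the interface:
// class GLRenderer  : public Renderer { /* glVertexPointer & friends */ };
// class D3DRenderer : public Renderer { /* IDirect3DVertexBuffer8 etc. */ };
```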

I tried this, knackered - however, I ended up having to bend it to suit GL rather than D3D. The main problem was in the storage and handling of vertex arrays. I need to choose vertex formats at runtime - I’m writing procedural geometry which fills the vertex buffers at runtime. So, I ended up with a “vertex buffer” class for GL and just the plain old D3D object (with a few bells and whistles - virtual functions falling through to the actual methods on this object). I also ended up wrapping the SetVertexShader functionality too. Basically, I was wrapping D3D but implementing its methods for GL - all the while adding one more layer of abstraction and hence possible confusion for the future maintenance guys.
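For illustration, the GL half of my wrapper ended up looking roughly like this (simplified, and the struct is made up - the real one carries more attributes):

```cpp
#include <GL/gl.h>

// Hypothetical runtime format descriptor; offsets are -1 when absent.
struct VertexFormat {
    int positionOffset;
    int normalOffset;
    int uvOffset;
    int stride;
};

// GL side: translate the descriptor into client-state pointer calls.
void setFormatGL(const VertexFormat& f, const unsigned char* base)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, f.stride, base + f.positionOffset);

    if (f.normalOffset >= 0) {
        glEnableClientState(GL_NORMAL_ARRAY);
        glNormalPointer(GL_FLOAT, f.stride, base + f.normalOffset);
    } else {
        glDisableClientState(GL_NORMAL_ARRAY);
    }
    if (f.uvOffset >= 0) {
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
        glTexCoordPointer(2, GL_FLOAT, f.stride, base + f.uvOffset);
    } else {
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    }
    // The D3D side maps the same descriptor to an FVF code instead,
    // e.g. D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1 - one more place
    // for the two back-ends to drift apart.
}
```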

The point I am trying to make is that, yes, it is entirely possible for me to write a wrapper for both. However, it is quicker and easier just to use D3D. I think things used to be the other way around - it was quicker to use GL, and you just couldn’t do certain things with D3D. Well, it’s flipped right over now.

And, no, I’m not 100% sure about this, just as I wasn’t 100% sure about using GL in the first place back in 2000, when I wrote the feasibility paper for the product I have just completed. GL just feels “messy” these days - from an aesthetic point of view - compared to D3D 8.1. I know all the extensions stuff is great for demos and SOTA work - but I think GL has lost it on the Janet and John stuff. You know, the bread and butter.

i currently use opengl 1.4 + ARB_fp and i can’t wait for ARB_vao. my code is not messy, and it’s very future-proof (at least, i think gl will survive for quite a while).

extensions are not useful if you want easy, clean code, yes. but once you start to really use dx you see that there is nothing different: either limit yourself to a low level of dx or cut off a huge number of people. use pixel shaders, and voilà - gf3+/radeon8500+ only, even though most of the effects could be done on a gf1…
use real pixel shaders (ps2.0), and voilà - you lose everyone except radeon9700 owners currently… the same situation i’m in. gl1.4, and especially gl1.5, will be about equal to dx9 in both usage and features. so i still see no need to move.

the dynamic vertex format in dx is crap, btw. i dislike only having vertex buffers, too. ARB_vp and ARB_fp are much better than what dx provides (variables, the automagically bound states, etc.). gl provides a lot; dx has a lot of gaps.
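for example, a minimal ARB_vp program looks like this (just a sketch - on windows you’d fetch the extension entry points via wglGetProcAddress instead of relying on prototypes). note how the mvp matrix is bound automagically by name - no per-frame constant uploads like dx needs:

```cpp
#define GL_GLEXT_PROTOTYPES   // assume direct prototypes; on windows, fetch
#include <GL/gl.h>            // these entry points via wglGetProcAddress
#include <GL/glext.h>
#include <cstring>

// transform by the driver-tracked modelview-projection matrix and
// pass the vertex colour straight through.
static const char* vp =
    "!!ARBvp1.0\n"
    "PARAM mvp[4] = { state.matrix.mvp };\n"  // gl keeps this up to date
    "DP4 result.position.x, mvp[0], vertex.position;\n"
    "DP4 result.position.y, mvp[1], vertex.position;\n"
    "DP4 result.position.z, mvp[2], vertex.position;\n"
    "DP4 result.position.w, mvp[3], vertex.position;\n"
    "MOV result.color, vertex.color;\n"
    "END\n";

GLuint loadVertexProgram()
{
    GLuint id = 0;
    glGenProgramsARB(1, &id);
    glBindProgramARB(GL_VERTEX_PROGRAM_ARB, id);
    glProgramStringARB(GL_VERTEX_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)std::strlen(vp), vp);
    glEnable(GL_VERTEX_PROGRAM_ARB);
    return id;
}
```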

both are great imho. but i see no use in coding for dx9 currently (i would never go to dx8… forget it). gl1.4/1.5 provides/will provide me the same feature set, with_OUT_ the stupid extension mess and all. and still, i’ll be able to additionally implement speed boosts, quality boosts or extra features per extension if i want. but that’s premature optimisation.

dx locks me out of that.

and my code is compilable on linux, on mac, etc. as well. no need to stay on windows.

and no, i don’t need that. but i don’t know - maybe one day i’ll win a mac somewhere, or i’ll choose linux. and then i can just continue coding as i did before…

gl2.0 will be great…

No, I have to agree with Robbo on this - GL is a real mess from a “getting a project finished before a deadline” kind of perspective.
Davepermen, you say that DX has limits too, and that you constantly have to check for certain feature support…the big difference is, however, that if a feature is available in DX, the interface is the same no matter what the vendor. I’ve harped on about this before, I know…but it’s pretty relevant.
Davep, am I right in thinking you’re not currently employed in the graphics industry? In which case, I’m sure you don’t mind that you can code directly down the arb_fp/vp path - but nobody in the real world has cards that support these extensions…so it’s pretty pointless talking about them in this context.
Don’t take that the wrong way, davep - I’ve heard you talk about various graphics topics, and you really do seem to know your stuff - but you’re not the most practical person when it comes to these issues.

You’re “starting a new project”? That means you “finished an old project”? If the old project already uses OpenGL, and has working code, then re-using that code (which presumably is already debugged) is surely easier than trying to write a DX wrapper from scratch?

Also, OpenGL gives you some semblance of portability. That may or may not be important to you.

No, jwatte - that’s really part of my point - if I use D3D, I will have a lot less code in the engine, period, and I won’t have to worry too much about specific code-path optimisations.

The kinds of classes I would inherit for a new project from my current one include math, mesh, vertex buffer, material classes, etc. Most of these things exist in one form or another in the D3D SDK by default - except their versions will have been tested by thousands of programmers on many different setups, rather than just by me in the office on whatever hardware I can grab hold of during a test iteration. The only thing I will miss is GL picking, but I only used that before because I didn’t have time to implement a proper ray-casting mechanism (which, incidentally, I have done since).

At the end of the day, I am just waiting for someone to come up with the killer argument in favour of GL, either now or in six months’ time. Don’t forget, I will have to answer the question in meetings at work as to why I want to use D3D now when I recommended GL a couple of years ago! I think I can justify using D3D today based partly on my experience of using GL over that time period. I couldn’t before, of course.

Wait, so learning DirectX will be easier for you than just using GL in this project? Man, I wish I could learn graphics APIs as fast as you can…

Tell you what to do…go use DirectX.

Close the door on your way out.

OpenGL 0wn5 j00!

-SirKnight

um, surely supporting VAR/VAO/immediate mode etc. is a 7-minute job to code up?
in the final development time for a project this equates to roughly 0.000034% (rounded down)

Rumor has it … DX9 tomorrow.

Back to the original question: “How is GL2 coming along?” Can anyone answer this? The last ARB meeting notes say there would be a final spec at the end of 2002.

The GL2 working group has selected glslang as the starting point, and is actively identifying and reducing issues from both the Cg and glslang documents. Still on track for a final spec by the end of CY 2002.

Has the December meeting taken place?

Originally posted by knackered:
No, I have to agree with Robbo on this - GL is a real mess from a “getting a project finished before a deadline” kind of perspective.
Davepermen, you say that DX has limits too, and that you constantly have to check for certain feature support…the big difference is, however, that if a feature is available in DX, the interface is the same no matter what the vendor. I’ve harped on about this before, I know…but it’s pretty relevant.
Davep, am I right in thinking you’re not currently employed in the graphics industry? In which case, I’m sure you don’t mind that you can code directly down the arb_fp/vp path - but nobody in the real world has cards that support these extensions…so it’s pretty pointless talking about them in this context.
Don’t take that the wrong way, davep - I’ve heard you talk about various graphics topics, and you really do seem to know your stuff - but you’re not the most practical person when it comes to these issues.

yes, exactly

but two things:
a) in gl, except for ARB_vao, it’s all the same for major features (dx does not provide more than the ARB does… gl does, if you want to use that additionally => i use gl, so i can code in an easy, standard way, and use the additional advanced per-hardware features if wanted/needed)
b) yes, i currently don’t work in the industry. my plan is to get my stuff working over the next 2-3 years, the same timeframe robbo had for his last project. in 2-3 years, dx9/gl1.4/1.5-capable hw will be standard among the higher-end gaming crowd, and they are my target… i wanna use the power of a modern cpu and gpu/vpu to get something really sweet running. when doom3 was started, no hw was expected to run it at really acceptable fps either…

and i can code fallbacks to older hw if needed. but that’s premature optimisation. first i wanna see my stuff working. then i wanna see my stuff working everywhere…

but sure, currently i would use dx for some projects, too. still, starting a project now, i would use gl again - it has evolved through its messy phase, thanks to the fancy proprietary extensions it got from nvidia and co. over the last few years.