
View Full Version : No GL3 and no newsletter yet...




k_szczech
01-07-2008, 07:43 AM
First of all - this topic is NOT meant to be another discussion about upcoming features, DX10 vs GL3 comparison or Windows Vista discussion, so hold your horses, will ya? ;)

This topic is meant as a reminder that someone, somewhere wishes to know something more about OpenGL 3.0 specification (implementations?) progress. I'm not expecting to learn any exact release date or new feature descriptions. I just want to know where we are now. :confused:

We haven't had a newsletter in ages, and it's pretty quiet. As I once said - I hope it's just the silence before the storm.

If anyone can provide us with some information, it would be appreciated.

Once again, please do not fill this topic with speculations and off-topic discussions. Only information please (or some related questions perhaps).

Black Knight
01-07-2008, 10:12 AM
I hope we get some information in January.
I'm tired of visiting opengl.org each day and finding nothing about OGL3.0. No newsletter, no announcement, nothing. It's, well, a little disappointing, but I'll keep on waiting.

bobvodka
01-07-2008, 12:09 PM
heh, ditto... every now and then I load up the main page, glance at it and move on :D

Seth Hoffert
01-07-2008, 02:36 PM
Same here. I get excited each time I load up the page, expecting there to be news of GL3. :)

zed
01-07-2008, 02:45 PM
im wondering if the silence is cause theyre being held up by something (perhaps a version of the d3dx library)
whilst its great to release everything in one great big shebang (itll prolly be too much to digest at once), i believe its better PR to release a bit at a time. the lack of communication is pretty weak as well

elFarto
01-07-2008, 03:35 PM
I think, as we haven't heard from them in a while, that Microsoft has dispatched ninjas to take them out before they can release a DirectX 10-crushing API.

Regards
elFarto

knackered
01-07-2008, 05:57 PM
No, Cass from nvidia was implying in mid-december that it won't be long now and is being actively worked on. Whilst not being anything official, it's still encouraging to me. Without that nugget I'd be very sceptical now.

k_szczech
01-08-2008, 06:11 AM
Yes, I know. Still, a newsletter would be nice. We were supposed to get one every season and the whole thing suddenly and silently collapsed.
It would be good to know where we stand now and where we are going. Nothing detailed or specific is required. I guess something like this: "We're working on GLSL", "We're working on drivers" :cool:, "We're blackmailing Microsoft to support GL3 on XBox360" :D would be enough.

Of course we do not wish to sound like Donkey asking "Are we there yet?" every 5 minutes. I realize that asking won't make things go faster and we have to wait anyway. I'm not expecting any promises either.

V-man
01-09-2008, 01:09 AM
I'm guessing they want to deliver stable drivers and announce the spec, documents, drivers all at the same time.
Don't worry guys. This isn't Fahrenheit.

Zengar
01-09-2008, 01:41 AM
I think you are being overly optimistic, V-man. I don't believe we will get decent drivers until at least half a year after a spec release. IMHO, something went terribly wrong there and they are redesigning a large portion of the API.

knackered
01-12-2008, 07:32 AM
*bump*
Come on ARB, give us some feedback - this is getting beyond the joke.

Timothy Farrar
01-12-2008, 12:22 PM
I'm guessing they want to deliver stable drivers and announce the spec, documents, drivers all at the same time.

Sure would be nice if this were the case. You would think that to make hard decisions in the production of a new API, you would have to do some kind of performance testing with real prototype drivers.

At least the Linux drivers should be damn easy.

If you think about it, at least for NVidia's drivers, the hardware interface is probably 99% or 100% the same as the GL2 stuff that is already functioning right now (GL2 SM4.0 is only missing a few DX10 features). Everything will probably just be a higher-level API change in some GL3 interface library. So perhaps a working GL3 could be done quickly once the spec is finished.

In any case I'm not holding my breath.

About the only thing that excites me about GL3 is that vendors (again other than NVidia) might actually have working GL drivers which allow you access to features in the hardware which have already been shipping for some time now.

Brolingstanz
01-12-2008, 01:16 PM
Maybe they want to wait and see if Punxsutawney Phil sees his shadow this year...

ebray99
01-13-2008, 05:16 PM
Maybe Punxsutawney Phil will set his shadow map to the wrong compare mode and we'll get an early GL3. =P

CrazyButcher
01-14-2008, 03:41 AM
Maybe Mr.T hax0red his way into the group and added random glDrawMohawks.

k_szczech
01-14-2008, 06:04 AM
glDrawMohawks? :D

I think it's the glReleaseSpecs function that threw an exception and crashed the whole system. Someone needs to go there and hit the reset button. :D

Don't worry. I've heard somewhere that "no news is good news" :p

Zengar
01-14-2008, 06:06 AM
Maybe Mr.T hax0red his way into the group and added random glDrawMohawks.

So, you have seen that WoW commercial too ;)

V-man
01-14-2008, 08:16 AM
search on youtube fool!

Jan
01-14-2008, 08:42 AM
Maybe they silently dropped OpenGL 3.0 and are afraid to tell us...

k_szczech
01-14-2008, 09:12 AM
Maybe they silently dropped OpenGL 3.0 and are afraid to tell us...
I thought they rather dropped something on their data storage (you know, the one with all the specs on it) and are afraid to tell us...

Shall we make jokes now to keep this thread alive, so eventually someone from ARB will respond to it?

This is exactly what I didn't want - it was supposed to be a short topic - a few simple questions and a few simple answers. I guess, since we have no answers, frustration brings out the worst in us. :p Or maybe it got the best of us.


Come on ARB, give us some feedback - this is getting beyond the joke.
If we don't get an answer this time, we'll start another topic in 2-3 months. "Beyond the joke" seems a suitable name for it. ;)

EvilOne
01-14-2008, 11:52 AM
This gives me a really stinky feeling... somehow it reminds me of the render-to-texture show the ARB gave in the past. It was really entertaining for some years, but in the end it was just a big shame. Let's see how the GL3 show performs. High-quality entertainment for years?

Hey ARB, don't forget, the API is not an end in itself - it's for us developers. That's right, the people on the other end who have to finish products, the crowd out there waiting for some updates.

A little hope of mine is that they throw away the GLSL stuff and instead design a binary token stream like the D3D9 shaders. I get the feeling that some simple p-code is easier to optimize. The optimizer could be at the driver level, but I have a bad feeling about the compiler being in the driver, and ATi has proven with its GLSL support that I am right. And after some time, maybe we'd have the choice between different languages, compilers, etc. What a dream. Give us binary tokens!
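
For reference, the D3D9 split I mean looks roughly like this - a minimal sketch with error handling omitted; src, srcLen and device are assumed to exist, and "main"/"ps_2_0" are just example parameters (needs <d3d9.h>/<d3dx9.h> and d3dx9.lib):

// Step 1: compile HLSL source into a binary token stream. This can run
// offline in the build pipeline - the driver never sees source code.
ID3DXBuffer* tokens = NULL;
D3DXCompileShader(src, srcLen,
                  NULL, NULL,        // no #defines, no #include handler
                  "main", "ps_2_0",  // entry point, target profile
                  0, &tokens, NULL, NULL);

// Step 2: hand the p-code to the runtime. The driver only has to
// translate and optimize simple tokens, not parse a whole language.
IDirect3DPixelShader9* ps = NULL;
device->CreatePixelShader((const DWORD*)tokens->GetBufferPointer(), &ps);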

Hopefully they get something out a month before Breakpoint 2008.
Ironic mode off.
Thank you for nothing.

Jan
01-14-2008, 02:54 PM
There is a difference between being ironic and being rude.

Though, of course we all feel similarly.

Jan.

knackered
01-14-2008, 03:33 PM
Give us binary tokens? We'll discuss binary tokens and other new features when they've got the GL3 re-factored API out the door. We want a clean, stable GL implementation for now.
I tell you, if D3d supported quad-buffered stereo and gen-locking I'd have moved over to it long ago.....
Meanwhile, back in purgatory.....

HenriH
01-14-2008, 08:51 PM
Count me in.

Timothy Farrar
01-14-2008, 11:29 PM
Seriously, do you all really think the delay is from the Khronos group? Personally, my wild speculation chips are on the delay being a vendor disagreement, which for good reason is not going to be made public.

Also EvilOne, if the GPL3 compiler isn't driver side then I ship you a beer!

Korval
01-15-2008, 11:25 AM
which for good reason is not going to be made public.

Actually, if there is a vendor dispute, it would be best to make it public. That way, whoever is part of the dispute can be publicly upbraided for holding up the spec.

The easiest way to end disputes like this is to embarrass the party responsible for it.

knackered
01-15-2008, 12:05 PM
Embarrassed people usually leave the room.

EvilOne
01-18-2008, 05:14 PM
Just to bump this thread up :-D

Timothy Farrar
01-19-2008, 11:37 AM
Also EvilOne, if the GPL3 compiler isn't driver side then I ship

BTW, this is a little OT, how do you edit your posts?

I cannot seem to find the button to do so .. my GPL3 should have been GL3....

ZbuffeR
01-20-2008, 03:28 AM
Apparently you can only edit posts when no one has posted after you.
It can prevent some heavy edits on the second page of a 20+ page topic ;-)

Brolingstanz
01-20-2008, 03:39 AM
Yea, so mindeth what ye scribeth err frobbeth thou not hence.

k_szczech
01-23-2008, 06:51 AM
Just to bump this thread up :-D
Just keep 'em coming.

Here's a little somethin' I previously said I'd rather not do:

"Are we there yet?" :D

Jan
01-24-2008, 03:23 AM
Are we there yet?

bobvodka
01-24-2008, 11:07 AM
No, and if you kids don't hush up I swear I'll turn this spec around and take us right back to 1.1...

:D

k_szczech
01-28-2008, 03:33 PM
I swear I'll turn this spec around and take us right back to 1.1
I'm programming under Windows, so I use 1.1 anyway :P

Actually, you can do a lot with GL 1.1. You can implement shadowmaps using just two extensions: multitexturing and the GL_ADD texture env mode. Hell, I bet you could do it without them, too :)

It's just that it's so much easier and faster with shaders :D

bobvodka
01-28-2008, 04:15 PM
picky picky picky :P

-NiCo-
02-02-2008, 04:30 AM
Come on Punxsutawney Phil (http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=232705#Post232705)! You can do it!! :)

Brolingstanz
02-02-2008, 10:17 AM
This just in from Gobbler's Knob... 6 more weeks of winter.

What the heck does that furry mutt know anyway?

k_szczech
02-05-2008, 04:54 AM
Err... What was I about to say?... I know it was something smart. If only I could remember...

Ah, yes! I was about to ask whether we will get 3 newsletters at once (Summer, Autumn and Winter editions). :D

Maybe GL 3.0 is so good that they have to wait until 2010 before releasing it, but they keep it secret so Microsoft won't throw all its resources into creating DX11 by then?

ZbuffeR
02-05-2008, 05:00 AM
Nvidia is currently busy acquiring Ageia :-D

-NiCo-
02-05-2008, 05:07 AM
Woehoew! go nvidia! http://downloads.totallyfreecursors.com/thumbnails/cheer.gif

k_szczech
02-05-2008, 09:18 AM
Nvidia is currently busy acquiring Ageia :-D
And do they need all the manpower they got for it?

Next steps:
-Intel acquires NVIDIA
-Microsoft acquires AMD
-IBM acquires Intel
-Sony acquires IBM
-Microsoft acquires Sony
-No OpenGL Pipeline Newsletter yet
-I acquire Microsoft
-I sell Microsoft to Coca Cola
-ID Software acquires Coca Cola
-ID Software releases OpenGL 3.0 specification

bobvodka
02-05-2008, 10:56 AM
Next steps:
-Intel acquires NVIDIA
-Microsoft acquires AMD
-IBM acquires Intel
-Sony acquires IBM
-Microsoft acquires Sony
-No OpenGL Pipeline Newsletter yet
-I acquire Microsoft
-I sell Microsoft to Coca Cola
-ID Software acquires Coca Cola
-ID Software releases OpenGL 3.0 specification

- ????
- Profit!

knackered
02-05-2008, 11:27 AM
the nvidia driver writers double as accountants now do they? sounds like where I work.

samantha
02-05-2008, 04:58 PM
Oh come on guys, stop this nonsense. Can we go back to serious discussion, please? Thank you.

So, according to my horoscope, this month will bring great change to my life.
I have a feeling this is going to be the GL3 specification release.

V-man
02-05-2008, 08:20 PM
Nvidia is currently busy acquiring Ageia :-D
And do they need all the manpower they got for it?

Next steps:
-Intel acquires NVIDIA
-Microsoft acquires AMD
-IBM acquires Intel
-Sony acquires IBM
-Microsoft acquires Sony
-No OpenGL Pipeline Newsletter yet
-I acquire Microsoft
-I sell Microsoft to Coca Cola
-ID Software acquires Coca Cola
-ID Software releases OpenGL 3.0 specification


I like that chain of acquisitions. Maybe Mattel will end up owning the entire world.

bobvodka
02-06-2008, 02:23 AM
Oh come on guys, stop this nonsense. Can we go back to serious discussion, please? Thank you.


ok..



So, according to my horoscope...

wait.. what? o.O

yooyo
02-06-2008, 08:27 AM
The whole story about GL3 looks like Duke Nukem Forever:
http://duke.a-13.net/

Zengar
02-06-2008, 08:38 AM
Give them another 5 years and I will agree :)

k_szczech
02-07-2008, 02:38 AM
I like that chain of acquisitions. Maybe Mattel will end up owning the entire world
So Longs Peak will be called Barbie and Mt Evans will be called Ken? No, I wouldn't like that. I'd rather stick to the original plan and let ID Software release "William Blazkowicz" and then "DX's Doom" versions.

Lindley
02-07-2008, 07:27 AM
Don't be silly. We all know Taco Bell will own the entire world.

k_szczech
02-18-2008, 08:19 AM
What was the original topic again?

I'm just bumping this thread up the list again in the same sarcastic way we have so far...

CrazyButcher
02-18-2008, 08:26 AM
we wont get any info from all those people who have signed NDAs, and who joke about the endless speculation around the absence of information.
maybe GDC brings something up

bobvodka
02-18-2008, 10:54 AM
Let's just say from my look over the proceedings there isn't a great deal about OGL mentioned...

Jan
02-18-2008, 12:10 PM
Indeed, searching for "OpenGL" only brings up two things, and they are about physics!?

Well, in 3 days we will know that we didn't get any info at GDC.

Jan.

ebray99
02-18-2008, 01:38 PM
Is anyone nervous about this? This is how I currently see things:

1.) OpenGL update not released when it was supposed to be.
2.) No updates on the status of the release.
3.) No information or talks at the GDC about OpenGL.

This is really disturbing to me, and my paranoid side says, "did something big (and really bad) happen over at Khronos? Is the new OpenGL even going to be released?"

The lack of presence at GDC is a huge concern for me. It really makes me wonder what's going on with OpenGL.

Kevin B

knackered
02-18-2008, 02:32 PM
ho hum, we've waited longer in the past for much simpler things.
It is however very suspicious that the one thing that could seriously slow down adoption of vista is showing all the signs of being scuppered. Money talks and s hit walks.

bobvodka
02-18-2008, 03:24 PM
Yes, but in the past we didn't have the ARB promising to talk more or announcing they are practically there and have a few things to sort out....

I'm not all 'doom and gloom' about it, but as time passes it starts to become less of an issue for me as I find other things to do with my time *shrugs*

Korval
02-18-2008, 04:35 PM
It is however very suspicious that the one thing that could seriously slow down adoption of vista

You honestly think that OpenGL 3.0 would have any effect on Vista adoption?

Zengar
02-18-2008, 07:32 PM
Why slow down adoption of Vista? It's the best OS MS ever made...

*ducks and runs away*

knackered
02-19-2008, 02:59 AM
You honestly think that OpenGL 3.0 would have any effect on Vista adoption?
Yes. A significant number of early adopters are gamers wanting dx10 graphics. Hard to believe, but you should see the garbage being talked about "dx10 graphics" on the gaming sites and magazines.

Jan
02-19-2008, 03:25 AM
Well, if OpenGL 3.0 had been released two years ago, such that current engines would make use of it and thus allow DX10 features on XP, Vista adoption (among gamers) might indeed have been slower.

But no matter whether OpenGL 3.0 comes out in a week or in six months or even later, it won't have any effect anymore.

Jan.

knackered
02-19-2008, 03:31 AM
Then microsoft's delaying tactics have succeeded.

bobvodka
02-19-2008, 03:55 AM
what has MS got to do with this? o.O

knackered
02-19-2008, 06:33 AM
Nothing bob, nothing at all.
Are you looking forward to the easter bunny delivering your eggs?

bobvodka
02-19-2008, 06:45 AM
...

now how about you answer my question as to what delaying tactics?

knackered
02-19-2008, 06:54 AM
ok, you appear to have a sense of humour block.
it's just conjecture bob, there's no evidence of delaying tactics, just as there's no evidence for any other suggestion made in this thread. It's a guessing game, but microsoft have more motivation to delay the spec than anyone else. If I were them I'd be delaying it, just as if I were a multi-millionaire paedophile like Michael Jackson I'd build a fairground in my garden.

k_szczech
02-19-2008, 07:22 AM
What was the original topic again?
It was supposed to be a short and perfectly serious topic, but that didn't work.
Probably because there was nothing to say in this topic. Nothing at all.

So now we're pretty much off-topic.

So a sense of humor is the basic requirement for anyone who wants to post in this thread. We're gonna keep posting nonsense until we produce so many posts that we all rise to the ranks of "OpenGL Gurus" and "OpenGL Lords". Maybe then we will be taken more seriously.
Right now there's not much to talk about - no new extensions recently and no new OpenGL version.

So back to pointless talk:

Don't be silly. We all know Taco Bell will own the entire world.
No they won't. They'll never buy the ACME corporation.

Hampel
02-19-2008, 08:03 AM
Right, because it has already been bought by Mr. Burns...

bobvodka
02-19-2008, 09:03 AM
ok, you appear to have a sense of humour block.
it's just conjecture bob, there's no evidence of delaying tactics, just as there's no evidence for any other suggestion made in this thread. It's a guessing game, but microsoft have more motivation to delay the spec than anyone else. If I were them I'd be delaying it, just as if I were a multi-millionaire paedophile like Michael Jackson I'd build a fairground in my garden.

*sigh*
My sense of humor is just fine. However, when someone mentions 'delaying tactics' with regard to a company that has nothing to do with the OpenGL spec, having pulled out of the ARB some time ago, AND given that the previous post was about Vista, asking what MS had to do with it would seem a logical question... now, if you had just said 'nothing, it was just a (poor) joke' then no problem; as it was, you posted that and THEN followed it up with a remark designed, imo, to indicate I was living in the fantasy world of a child and would never suspect MS of such a thing, thus my request for clarification.

All this proves is that 'joking' without smilies or any of the other clues which you get IRL is just a bad idea...

knackered
02-19-2008, 09:23 AM
*sigh*
Bob, the fact that microsoft is not a member of the ARB is irrelevant. They basically have nvidia, ati and intel by the balls. They have us all by the balls, truth be known. If microsoft wanted to delay the GL3 spec, they could without any difficulty at all. I just proposed that they had, as a possible explanation for this otherwise inexplicable delay in the spec (let alone implementations).

k_szczech
02-27-2008, 04:00 AM
Yeah, I know we're way off topic and I don't mind, but why is it that every time we go off-topic somebody mentions Microsoft? :P

Brolingstanz
02-27-2008, 03:43 PM
I'd rather plunge head first into a vat of whale dung than have another tit-for-tat Microsoft conspiracy theory debate (unless complimentary cheese and crackers are served).

Though I suppose it's true... MS could have us all hopping around on one leg, if they were so inclined.

PaladinOfKaos
02-27-2008, 03:50 PM
Conceivably, MS could be doing the same thing they're trying to do to Linux - claiming they have patents, then not saying exactly what those are. Khronos probably takes those sorts of threats more seriously than Linus Torvalds, so they might be doing a search of MS's entire patent portfolio in order to make sure everything's in the green.

And for the record, I don't think that's actually happening.

knackered
02-27-2008, 03:56 PM
And for the record, I don't think that's actually happening.
No?
http://news.bbc.co.uk/1/hi/business/7266629.stm
They are pretty underhand y'know.

Mapping a lightweight API onto existing DX10 driver entry points shouldn't take this long.

EvilOne
02-28-2008, 11:31 AM
Maybe it's because one of the big CPU manufacturers spent all their resources fixing a TLB bug and doesn't have any time for other things. *hehehe* Or maybe it's the other CPU manufacturer torpedoing GL3 because of their crappy on-board accelerators and drivers. *hompfh*

Hmmm, as regards to my first post I must say, the render-to-texture show was more entertaining... at least, from time to time there was an update. Okay, got something to do... starting the Duke Nukem Forever install.

Korval
02-28-2008, 11:40 AM
the render-to-texture show was more entertaining... at least, from time to time there was an update.

I sat through the "render-to-texture" show, and it was not more entertaining. It was quite a bit worse.

Here, at least, we had a full year of information, in 4 newsletters. There, we had maybe a couple of presentations, at most. Here, it's a totally new API; there, it was just figuring out a way to render to a texture. Here, we still have GL 2.1 to use in the meantime; there, we had pbuffers, which cannot reasonably be called an RTT solution.

The "RTT" show was 10x more embarrassing for the ARB and much more infuriating for those who were waiting on it.

EvilOne
02-28-2008, 12:05 PM
Please cool down :-D I was just kidding.

The problem I have with current GL2 is that I spend 20% of my time programming and 80% of my time in trial-and-error sessions figuring out how a specific driver and card combo behaves...

Take the buffer objects we've had since 1.5. I have the strong feeling that someone in the ARB thought: "Okay, let's take a look at D3D... they have working static and dynamic arrays for years, the vendors have working code in the drivers - so let's build something completely different and reinvent the wheel." Rendering from buffers and laying out buffers is currently guesswork.

Another sore point is those damn software fallbacks. Guesswork again. Okay, we have render-to-"something" now... but there is no way to find out exactly what works and what doesn't.

At least for me the guesswork is over... I'm currently switching to D3D. I'll give GL (and multiplatform *sniff*) another try in the next engine version, so maybe in three or four years. [censored] happens.

knackered
02-28-2008, 01:28 PM
if I could get quad buffered stereo from d3d, I'd be right with you evilone. Like a shot.

Korval
02-28-2008, 02:09 PM
"Okay, let's take a look at D3D... they have working static and dynamic arrays for years, the vendors have working code in the drivers - so let's build something completely different and reinvent the wheel." Rendering from buffers and laying out buffers is currently guesswork.

You're looking at the problem from the wrong angle. The problem isn't buffer objects. It's the mechanism by which they are used. The decision was made to make VBO code resemble as closely as possible regular vertex array code. So they didn't create a new set of entrypoints for setting up vertex data (unlike ATi's Vertex Array Object extension). This had the benefit of being able to automatically work with new vertex pointer calls like glVertexAttribPointer.

But it had the problem of forcing an IHV to basically accept, at a low level, any arbitrary set of array bindings. It also forced IHVs to do a lot of setup code on gl*Pointer calls, which would have been more reasonably implemented as an object to store them.

So the problem is mostly that the legacy API kept tripping up what the ARB really wanted to do. Which is what GL 3.0 is all about dealing with.
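
To make that concrete, here is a minimal GL 2.x sketch (the two-attribute layout and the names vbo/vertexCount are invented for illustration):

// VBO setup reuses the old vertex array entry points; once a buffer is
// bound, the "pointer" argument becomes a byte offset into that buffer.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glEnableVertexAttribArray(0);
glEnableVertexAttribArray(1);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 20, (void*)0);   // position
glVertexAttribPointer(1, 2, GL_FLOAT, GL_FALSE, 20, (void*)12);  // texcoord

// The driver only sees the complete, arbitrary set of bindings at draw
// time, so it has to validate and set up that combination on every draw.
glDrawArrays(GL_TRIANGLES, 0, vertexCount);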

EvilOne
02-29-2008, 12:41 AM
Yeah, compatibility at any price. But I would have been happier writing a second code path for rendering from buffers than with the current situation.

Somehow it seems that D3D is draw-call-bound, and GL is now setup-call-bound.

And by the way, a little wish I have for GL3. I'd like to have the Direct3D9 DrawIndexedPrimitive call emulated:


HRESULT DrawIndexedPrimitive(
D3DPRIMITIVETYPE Type,
INT BaseVertexIndex,
UINT MinIndex,
UINT NumVertices,
UINT StartIndex,
UINT PrimitiveCount
);

In particular, BaseVertexIndex and StartIndex are what I have always missed. So this would help with setup cost somehow. The D3D version of my landscape renderer looks so elegant... Bind once, then draw, draw, draw... Maybe I have just overlooked the similar function in GL (my multiplatform backend ambitions were a bit halfhearted).
:-)

Dark Photon
02-29-2008, 06:12 AM
by the way, a little wish I have for GL3. I'd like to have the Direct3D9 DrawIndexedPrimitive call emulated:


HRESULT DrawIndexedPrimitive(
D3DPRIMITIVETYPE Type, INT BaseVertexIndex,
UINT MinIndex, UINT NumVertices,
UINT StartIndex, UINT PrimitiveCount );
In particular, BaseVertexIndex and StartIndex are what I have always missed....Bind once, then draw, draw, draw... Maybe I have just overlooked the similar function in GL
I believe you have a StartIndex analog in OpenGL. Just bind your vertex attribute and index VBOs, then pass the appropriate index start offset to "indices" in glDrawRangeElements (http://www.opengl.org/sdk/docs/man/xhtml/glDrawRangeElements.xml) (0, 128, etc.). For those like me that don't do D3D, the idea here is you have the indices for a bunch of batches stored in the same index VBO, and you want to issue multiple batches without rebinding VBOs.

However, there's no BaseVertexIndex analog AFAIK (i.e. a value that gets added to every index that is fetched). Only useful when you don't know what offset your vertex attributes are going to be from the start of their VBO when you cook the index list for the index VBO -- rarely true, but I could see this potentially being useful for streaming into pre-allocated VBOs. In that case, just add N to your indices when you store them. Simple and cheap.

The MS canonical docs on this API (http://msdn2.microsoft.com/en-us/library/bb174369(VS.85).aspx) are the pits, but this blog post (http://blogs.msdn.com/jsteed/articles/185681.aspx) does a good job of explaining what MS didn't.
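
In code, the "bind once, then draw, draw, draw" pattern looks something like this (a sketch only - the names vertexVBO/indexVBO and the ranges, counts and offsets are invented):

// One vertex VBO and one index VBO holding the indices of several batches.
glEnableClientState(GL_VERTEX_ARRAY);
glBindBuffer(GL_ARRAY_BUFFER, vertexVBO);
glVertexPointer(3, GL_FLOAT, 0, (void*)0);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexVBO);

// With an index VBO bound, "indices" is a byte offset into it, so each
// batch is just another draw call - no rebinding in between.
glDrawRangeElements(GL_TRIANGLES, 0, 1023, 300, GL_UNSIGNED_SHORT, (void*)0);
glDrawRangeElements(GL_TRIANGLES, 0, 1023, 450, GL_UNSIGNED_SHORT,
                    (void*)(300 * sizeof(GLushort)));
// (For the missing BaseVertexIndex: bias the indices by N when filling the VBO.)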

ruysch
02-29-2008, 09:28 AM
Where is the binary shader interface? Why do we have to recompile every freaking shader every time?

Where is the clean interface allowing us to set states inside shaders?

Where are the hardware queries for vcache usage, bandwidth used, etc.?

Where is the shader precompiler?

Where is the support for half floating-point? (Yeah, it was exposed once by NVIDIA, but to conform with the *standard* they removed it.)

Where is the GL version of D3DX?

Where is the GL version of DXUT (and don't say free glut, glut ...)?

The problem with OpenGL is that it's a standard which in itself isn't worth much; it's the tools, tools, tools.

There was a time when a company (SGI) didn't see the potential in the mass-consumer market, and now there is a standard which hasn't discovered that standards are there for people to use them, and for people to use it they need tools.

The problem with the people (including myself) devoted to OpenGL is that every time the ARB board/SGI/Khronos f*cks up (sorry about the language ;-), we still whine about Microsoft and DirectX.

But what about ATI or NVIDIA? Look at the tools they make for DirectX and then take a look at what they offer for OpenGL.

My advice to the ARB board: start up your favorite editor and keep coding like us ;-) Because we don't need any more standards, we need tools! We do not need any more excuses about evil Microsoft, who obviously is doing a much better job than we are ...

CatDog
02-29-2008, 09:56 AM
Can somebody recommend a good book that describes D3D from an experienced OpenGL user's point of view?

(This ranting seems to be fun - but I need more input, before I can participate.)

CatDog

ruysch
02-29-2008, 10:04 AM
Book, that would cost you money ...

Just download the SDK, that's the proper way to learn DirectX.
http://msdn2.microsoft.com/en-us/directx/aa937788.aspx

There is also a free online draft of a DirectX9 book.
http://www.xmission.com/~legalize/book/download/

You could also start with XNA,
http://forums.xna.com/

Roderic (Ingenu)
02-29-2008, 11:42 AM
...and now there is a standard which hasn't discovered that standards are there for people to use them, and for people to use it they need tools.


Or maybe it does, and that's the source of the problem.
Ever noticed that OpenGL is not ONLY for games?
In fact its usage is way wider in other areas than in gaming, which raises the question: who is OpenGL for?
(Democracy would mean the game industry would have little weight in decisions about OpenGL, since it's a minority.)

EvilOne
02-29-2008, 11:52 AM
Maybe someone wants to elaborate on the experience of using GL under OS X. I have read that GL under OS X is mostly implemented by Apple (so to say, the D3D way)?

Korval
02-29-2008, 12:03 PM
Where are the hardware queries for vcache usage, bandwidth used, etc.?

That belongs in a platform-specific profiling system, not a platform-neutral graphics API.


Where is the support for half floating-point? (Yeah, it was exposed once by NVIDIA, but to conform with the *standard* they removed it.)

... and? Half-float was a temporary measure that nVidia needed because their GeForce FX hardware sucked.

Also, this isn't a tool.


The problem with OpenGL is that it's a standard which in itself isn't worth much; it's the tools, tools, tools.

Nonsense. Give me a performant API that doesn't have pitfalls and has implementations I can trust, and I'll take care of the external tool stuff. I'm a grown-up programmer; I don't need any particular hand holding.

Tools aren't going to make it easier for me to figure out which kind of vertex formats hardware supports in code. Tools aren't going to make ATi's glslang compiler work. Tools are meaningless if you can't even trust the API that those tools depend on.

ruysch
02-29-2008, 12:14 PM
I reckon then everything is just sweet candy.

Look at XNA from Microsoft, now that's a platform which is going to have long-lasting consequences on the flow of programmers into the DirectX camp. Look at Game Studio, again from Microsoft.

Look at the content pipeline provided with XNA, but I reckon tools ain't useful....

Look at Microsoft PIX, or NVPerfHUD, again profiling is overrated....


I mean be honest, take a look at the *OpenGL SDK*
http://www.opengl.org/sdk/

Is that what OpenGL has to offer the next generation of graphics developers?

knackered
02-29-2008, 12:53 PM
lol! do you really think professional programmers use XNA Game Studio??
PIX and perfhud are great tools, but that's about it for useful things in DX. Everything else you mentioned is for amateurs/hobbyists.
The XNA framework itself is pretty pointless in my opinion. If I'm writing a game for multiple consoles, what have I to gain from locking the PC and 360 versions to some win32 specific bloatware? I'll have to come up with a tool chain for the other platforms anyway, so why would I bother with the added complexity? The answer is of course that I wouldn't, I'd either write the tools and platform-specific code myself, or I'd use some tried and tested middleware such as unreal or renderware.

Zengar
02-29-2008, 01:25 PM
I absolutely agree with knackered and Korval (hey guys! for the first time you actually _say_ something similar :) ) --- D3DX, GLU and things like that are toys and are not needed at all besides for small demos, as the required functionality is coded within one or two days by a more or less proficient coder. Actually, you don't need an "SDK" at all (actually, I never understood what this SDK thing is about? Usually when people talk about "SDK" they seem to mean API bindings). We don't need API-specific tools, we need a good API first. The tools will come with time.

bobvodka
02-29-2008, 02:14 PM
I wouldn't have said they 'aren't needed at all'; while they aren't as much use to big software houses and people who know what they are doing, having something like that to kick-start your learning of the API can be and IS helpful.

knackered
02-29-2008, 02:16 PM
An up-to-date and easy to implement API is perhaps a little bit more important, however. Which brings us back to this ridiculous delay in the GL3 spec.

Zengar
02-29-2008, 02:20 PM
Well, I mostly said "not needed" to make a dramatic impression. I hope you all understood what I meant --- it isn't so much different from what knackered and Korval (just to name a few) were saying.

knackered
02-29-2008, 02:40 PM
I think bobvodka's struggling with the english language, specifically the difference between 'needed' and 'helpful'. Yes an SDK would be helpful to novices, but no it's not needed by the main people a low-level rendering API is aimed at - i.e. people who use it to make money or to help in research projects.

pudman
02-29-2008, 03:31 PM
Where are the hardware queries for vcache usage, bandwidth used, etc.?

That belongs in a platform-specific profiling system, not a platform-neutral graphics API.

If the hardware queries are directly related to the graphics API, the performance of said API, or configuration of said API, might the API provide even a simple extensible mechanism by which IHVs could provide those capabilities?

Or maybe this begs for an OpenGP... "Open Graphics Profiler" API that would happily coexist with GL. Then again, I believe there was some mention of a GL3.0 debug profile which may be what people actually need.

Maybe because Khronos is (possibly) redoing GL3.0 they'll skip v3.0 and head right to v5.0. That might put D3D coders more at ease... skip those versions that we all know will suck. I have high hopes for GL10.x!

Ilian Dinev
02-29-2008, 04:15 PM
My only wish for GL is that writing to the FIFO doesn't take thousands of cycles: glBindBufferARB - 2k cycles, gl*Pointer - 2k cycles, ...
It would be great imho if nVidia and ATi provided some closed-source driver/interface for each type (or just for at least one or two) of their most recent cards. It'd be just a few cards: the GF7200, GF7x00, GF8x00, GF9x00, R1x00, RHD2x00. These are things we can handle easily with wrappers, allowing us to pinpoint-optimize for some cards. This would probably open up space for some more things that can be done in a frame, or simplify some designs. A GL/DX wrapper will still be necessary, but I'd rather code more than spend most of the time trying to see how specific cards and drivers behave.
With such a low-level interface, windowing and vram resources will be problematic, but I don't think it'd be too hard for them to be managed either by the driver or by ourselves.

Or simply GL3 should finally come >_> ... and it'd better be lightweight.

We don't need tools, just a way to compile and bind shaders, set uniforms, queue-up vertices with attributes from vram/sysram, draw points/wide-lines/tristrips/indexed_tris, do occlusion queries, use RTT and R2VB, tweak basic hw facilities (culling, depth-test, stencil), and present onscreen - all being as fast and pipelined as possible.

Korval
02-29-2008, 04:17 PM
If the hardware queries are directly related to the graphics API, the performance of said API, or configuration of said API, might the API provide even a simple extensible mechanism by which IHVs could provide those capabilities?

Hey, IHVs have always been free to add whatever extension to OpenGL that they want.

noncopyable
02-29-2008, 04:34 PM
... and? Half-float was a temporary measure that nVidia needed because their GeForce FX hardware sucked.

Don't know its origin, but a half float is in most conditions a lot more than enough, and half the memory cost; imo it is another >>must<< for a serious api/language, whatever you call it.

ruysch
02-29-2008, 04:49 PM
Let's just not hope halfs are in the spec... or what, Korval? Because I know half will be in OpenGL 3.0 .....

Komat
02-29-2008, 04:55 PM
... and? Half-float was a temporary measure that nVidia needed because their GeForce FX hardware sucked.

Shaders using halfs are usually faster even on GeForce 7 class hw because of the dedicated normalization unit and increased latency-hiding capabilities.

Korval
02-29-2008, 05:07 PM
Don't know its origin, but a half float is in most conditions a lot more than enough, and half the memory cost

Nonsense.

Half-floats are a phantom. An illusion. For half of the hardware out there (more than that, now that nVidia has largely abandoned them), half-floats don't even exist.

There's no memory savings with half-floats because ATi doesn't support them. They never did. On no ATi hardware will you see a benefit when you use them. And since it isn't something that is cross-platform, it isn't something you can rely on.

Now, maybe it's something you want to test for and swap in different shaders. But it certainly isn't something that ATi or anyone else should be forced to support in any way just because nVidia decided to save some transistors in their NV3x/4x hardware.

Komat
02-29-2008, 05:25 PM
--- D3DX, GLU and things like that are toys and are not needed at all besides for small demos, as the required functionality is coded within one or two days by a more or less proficient coder.

You are greatly underestimating the features of D3DX. There is more to it than simple helpers. I would suggest that you read its documentation.

Even the HLSL compiler is part of D3DX and not of the core runtime. And D3DX contains mathematics operations optimized by the individual CPU vendors (as far as I know). Also, the effect framework, which appears to be used by some professional developers, is part of D3DX.

Some features of D3DX are clearly meant for use in a build pipeline (e.g. the unwrapper of UV coordinates, functions for calculating coefficients for precomputed radiance transfer). Yes, you can write your own version, but why do that if someone else has already created a usable one?



Actually, you don't need an "SDK" at all (actually, I never understood what this SDK thing is about?
The DX SDK is a nice package in which you get documentation, headers, the latest version of the D3DX library, and examples.

-NiCo-
02-29-2008, 05:27 PM
If they really choose to abandon halfs, I wish they would at least add a color-renderable GL_RG32F_ARB texture format to get 64 bits per pixel (mainly to save bandwidth in PBO operations).

Right now, the only non-RECT color-renderable tex format with 64bpp is GL_RGBA16F_ARB. It's supported natively up till NV44 that I know of; I don't know if it's supported natively on more recent hardware though.

Nvidia's CUDA language still has support for halfs through the CUDA driver API and CUDA is only supported on GF8 and up so I guess it's still supported natively on GF8 hardware...

Please correct me if I'm wrong :)

ruysch
02-29-2008, 05:29 PM
Cross-platform, what is that exactly? OpenGL isn't even supported directly on most of the world's computers; sure, XP did support OpenGL 1.1, but the term cross-platform is an illusion..

Halfs? The hardware has them - or haven't you used 16-bit floating point blend? Never tried a fast normalization using half4?

-NiCo-
02-29-2008, 05:34 PM
You can get all the OpenGL functionality you need just by downloading the appropriate graphics card driver.

About cross-platform... I just can't use Direct3D in Linux :(

Komat
02-29-2008, 05:40 PM
If they really choose to abandon halfs, I wish they would at least add a color-renderable GL_RG32F_ARB texture format to get 64 bits per pixel (mainly to save bandwidth in PBO operations).
Korval is talking about the half type within the shader (which still has advantages on GF7 and older hw, I do not know how it is with the G80), not about the 16F format of textures/render buffers.

-NiCo-
02-29-2008, 05:55 PM
Ah, I see. Guess I was thrown off by the 'half the memory cost' quote in his post :)

noncopyable
02-29-2008, 06:52 PM
And I was talking about the 16F textures/render buffers ><

bobvodka
02-29-2008, 10:33 PM
I think bobvodka's struggling with the english language, specifically the difference between 'needed' and 'helpful'. Yes an SDK would be helpful to novices, but no it's not needed by the main people a low-level rendering API is aimed at - i.e. people who use it to make money or to help in research projects.

"Not needed at all" was the phrase I replied to; this implies there is no requirement AT ALL for it, however the fact that it could be helpful (and indeed, one the reasons I'd recommend learning D3D over OGL right now is that having D3DX and DXUT takes away some of the pain) proves there is a need for beginners.

It's OK for people who have been using 3D APIs for a while to say 'oh, you can code this in a day or two' or 'people in industry won't use it', and I'm sure it's true, however everyone has to start somewhere, and at the point of learning the API to get an interest, not everyone can write the code in a day or two or wants to use some great abstraction over the API; they want to use the API, and it's helpful to have support code there which is 'official' and known to work properly.

Yes, a cleaner API will help and that should be a priority of course, but you can't sit there and write something off as 'not needed' because professionals won't use it.. there is more to the OpenGL world than professionals and people who have been using it for years and, like it or not, these people sometimes need something to help them in.

So, no, no problems with the english language here thanks.

ruysch
03-01-2008, 01:38 AM
You can get all the OpenGL functionality you need just by downloading the appropriate graphics card driver.


What driver? ATI hardly supports Linux, you can't be serious!
And extensions are always exposed first through the Windows driver.
And Mac OS, oh, do not get me started ;-)

knackered
03-01-2008, 04:01 AM
ATI hardly supports Windows GL, so what's your point?

MeneerDePeer
03-01-2008, 05:26 AM
Heh, right :) Ask someone with experience on Windows and ATi..

*jumps*

Jan
03-01-2008, 06:13 AM
That would be me, and I think you are exaggerating immensely. I have had the same number of issues with ATI as with nVidia, and usually one could argue that those issues are present because of the complicated specification of OpenGL 2.1.

Let me guess, you are working all day with nVidia-hardware and at some point you run it on an ATI-card and something doesn't work. Therefore you think ATI is bad.

Well, with me it's the other way round. I work with ATI-hardware all the time and at some point run it on nVidia. And often something doesn't work (although it should). But because that's just my subjective observation, it doesn't say anything about the quality of ATI's or nVidia's drivers. It just says that when you work mostly with one vendor's hardware, you will only really notice the issues that you have with the other vendor's hardware.

It is certainly true that ATI was quite bad in the past, but they have improved very much.

Jan.

MeneerDePeer
03-01-2008, 06:26 AM
Let me guess, you are working all day with nVidia-hardware and at some point you run it on an ATI-card and something doesn't work. Therefore you think ATI is bad.

Let me debunk that for you: I've used an ATi card for one and a half years. Most of that time was terrible. Not only with programming, but especially in using it with 3d programs.

Before, I used nVidia; before that, I used ATi, which was a horrible time as well. Why did I go with ATi again after the nVidia? Well, I wanted a new system quickly and only ATi was left as a choice.. worst mistake I ever made.


Well, with me it's the other way round. I work with ATI-hardware all the time and at some point run it on nVidia. And often something doesn't work (although it should). But because that's just my subjective observation, it doesn't say anything about the quality of ATI's or nVidia's drivers. It just says that when you work mostly with one vendor's hardware, you will only really notice the issues that you have with the other vendor's hardware.

Unfortunately, ATi has just really been very, very bad in the past; there's no way around that, and I've had the full experience.


It is certainly true that ATI was quite bad in the past, but they have improved very much.

Well, I'll be honest, you're right. The latest drivers were definitely a lot better and made the whole thing a lot more enjoyable. If they continue like that, I might think about buying an ATi some day again. But for now, I know what to expect from nVidia. And even though nVidia isn't always great either, they have always given me the best combination of good hardware and good software, and there was always a way to make things run mostly well and to keep things stable (using older drivers, for example). I can't say the same for ATi.

We'll see what the future brings! I hope the GL3-spec will be a part of that future, to get back on the topic again ;)

ruysch
03-01-2008, 06:28 AM
My point is that the cross-platform argument for OpenGL does not even hold for two of the largest vendors of graphics hardware, and on other OSes it becomes even worse.
OpenGL has slumbered too long and the market is forever lost. I suspect that OpenGL 3.0 will not bring OpenGL into a position where it will be able to even slightly compete with DirectX, and this will mark the end of an era which started with SGI; now everything is crumbling.

MeneerDePeer
03-01-2008, 06:34 AM
Well, as mentioned before, Direct3D (not DirectX, that's more than OpenGL is meant to be used for) is mostly used for games.

When you look at other apps you'll see that OpenGL is still king. 3ds Max, Maya, Cinema4D, Modo, Lightwave, Houdini, Blender, Silo, Mudbox, Softimage|XSI, Rhino, Bryce, and that's only the list of "bigger" names. The list goes on and on. Direct3D doesn't have any position in that market.

CatDog
03-01-2008, 07:40 AM
Hm. Pro/Engineer, MicroStation, AutoCAD, Inventor? (Maybe these names are not big enough?)

CatDog

MeneerDePeer
03-01-2008, 08:07 AM
Not my kind of show, but you got the point :)

ruysch
03-01-2008, 04:00 PM
The list goes on and on. Direct3D doesn't have any position in that market.

Funny thing is that the few of the products you mention which are actually being used by commercial game companies actually have vast support for DirectX; see for example the Microsoft FX effect interface. 3DS Max actually uses DirectX as its base renderer.

I am not sure how serious products such as blender and softimage are in the content creation industry? Perhaps you could elaborate on which game studios use blender or softimage?

Also think of it, go back 5 years... OpenGL was the de facto standard! Today that position has been taken by Microsoft DirectX. With DirectX 10, Microsoft actually had the *standard* ready (and with a reference device implementation) prior to the release of DirectX 10 hardware. The problem is that Microsoft has managed to direct the evolution of graphics hardware in collaboration with both NVIDIA and ATI. Sure, there is Khronos, and NVIDIA and ATI are still members of the ARB board; but frankly, does it seem like they are devoted to a rescue plan for OpenGL?

There might be a market for OpenGL ES on mobile devices, but in the long run I don't see a place for OpenGL, simply because it evolves too slowly and because the IHVs aren't focused on OpenGL at all.

The fact that Microsoft actually publicly releases a *beta* SDK with a software reference device implementation says a lot about their openness to the end users (programmers).
They even provided some fairly good examples of its usage, and have kept providing good documentation, publicly available.

For OpenGL 3.0 we have nothing ... There is virtually no information on OpenGL 3.0; how should companies be able to prepare beforehand for an upcoming standard which is largely unknown? Why this secrecy?

sqrt[-1]
03-01-2008, 05:02 PM
Not sure about blender, but softimage is used by the Valve guys (Half Life etc). (I come from a Maya house however)

ruysch - I am failing to see why you signed up to these forums. You only seem interested in convincing other people to "see the light" of your Direct3D decision.

ruysch
03-01-2008, 05:11 PM
Well, DirectX is the promised land. I prefer DirectX, though that doesn't mean I use it exclusively; like most others in the graphics industry, I started out with OpenGL. I just fail to see why the OpenGL standard time and again fails to anticipate the evolution of the industry.

They should have settled for something less than OpenGL 3.0 long ago and instead provided a useful standard which the industry could have embraced, instead of this continuous waiting for something which is likely to end up as nothing but a reflection of current hardware.

I personally would like OpenGL to learn from DirectX; that doesn't mean OpenGL should adopt everything as the holy grail, but there certainly is a lot to be learned.

MeneerDePeer
03-01-2008, 05:16 PM
Funny thing is that the few of the products you mention which are actually being used by commercial game companies actually have vast support for DirectX; see for example the Microsoft FX effect interface. 3DS Max actually uses DirectX as its base renderer.

Actually, 3ds Max has (or had at least; I haven't used it for a while) support for 3 rendering back-ends: OpenGL, Heidi (software) and Direct3D. The last one was added later on, and OpenGL is still the default. I don't see how that makes D3D the "base renderer" for Max.


I am not sure how serious products such as blender and softimage are in the content creation industry? Perhaps you could elaborate on which game studios use blender or softimage?

Blender is probably getting there but has some bridges to cross. Softimage|XSI is used by well-known studios; Blur is one that springs to mind while typing this. Also, you're again talking about games.


Why this secrecy?

Now that's a very good question :) I'll buy anyone with an answer a beer.

ruysch
03-01-2008, 05:21 PM
Not to be rude or anything, but aren't most of us in the gaming industry? The people I know in the movie industry are hardly using graphics hardware at all; mostly they use software-based renderers.

Korval
03-01-2008, 06:49 PM
I am not sure how serious products such as blender and softimage are in the content creation industry? Perhaps you could elaborate on which game studios use blender or softimage?

Blender is occasionally used as a previs tool, but isn't used for production in very many places. But XSI is a popular 3D modeling/animation package, right up there with Maya and 3DS Max.


Also think of it, go back 5 years... OpenGL was the de facto standard!

Huh? Since when? D3D has been more popular than OpenGL in gaming since D3D 5 or so. Certainly by D3D 8, it was the dominant gaming graphics API.

Well, on Windows.


in the long run I don't see a place for OpenGL, simply because it evolves too slowly and because the IHVs aren't focused on OpenGL at all

Well, that's a leap of logic. Essentially, you're assuming the GL 3.0 delay is because IHVs have effectively stopped caring about it, rather than for some other reason. It's not an unreasonable assumption, but it is no more or less grounded than any other assumption about why GL 3.0 was delayed.


There is virtually no information on OpenGL 3.0

No, there's plenty of information on GL 3.0. What there isn't is recent info on it (more than 6 months since the last substantive update), nor is there anything close to a timetable as to when the specification would be released.


I personally would like OpenGL to learn from DirectX

Personally, I don't. I mean, there are some nice things in D3D, but there are some abstractions that are simply unnecessary. Better to start from a direction where you look at D3D, but don't necessarily do things that D3D does simply because D3D does them. That is, you build a good API by looking at what the hardware needs and what abstracts it correctly (which OpenGL pre-3.0 has never really done).

V-man
03-01-2008, 08:27 PM
I reckon then everything is just sweet candy.

Look at XNA from Microsoft, now that's a platform which is going to have long-lasting consequences on the flow of programmers into the DirectX camp. Look at Game Studio, again from Microsoft.

Look at the content pipeline provided with XNA, but I reckon tools ain't useful....

Look at Microsoft PIX, or NVPerfHUD, again profiling is overrated....


I mean be honest, take a look at the *OpenGL SDK*
http://www.opengl.org/sdk/

Is that what OpenGL has to offer the next generation of graphics developers?


OK, we get it. Microsoft is a rich company.

Timothy Farrar
03-01-2008, 10:23 PM
Sqrt[-1], v-man, korval, komat, -NiCo-, knackered, and all, thanks for all the posts ... this has been a great thread, but I think you all are falling for ruysch's troll bait. As for ruysch, grow some balls and post either using your real name, or add your real name to your profile.

However this thread is just getting too fun to skip out of posting on.

PC games: a DX10-only market seems a no-go for any major publisher because Vista never caught on, and the vast majority of gamers still run XP. Also, piracy seems to be driving the majority of companies (US and non-MMO) out of the PC market and perhaps into consoles. For consoles, PS3 is DX9 level and the 360 is not at full DX10 level. So all games, especially those intending to be profitable by porting to multiple platforms, still need a DX9-level (in terms of technology) engine. Let's keep in mind that a majority of the game industry still does not use DX. Look at the sum of PS2, PSP, PS3, Wii, and other game platforms/titles compared to XBox/360.

Now let's get back to GL. Apple is slowly catching up to having good GL support. Anyone checked what features are now in 10.5.2? I believe EXT_gpu_program4 is supported on the NVidia 8 series (I also heard that CUDA was ported to OSX as well). How is, with 10.5.2, all the DX9 level functionality for GL on OSX with ATI cards? Given that ATI/AMD isn't adding EXT_gpu_program4 type support for HD anytime soon, DX10 level functionality on ATI cards is going to be stalled as well until after GL3. I think Apple works with the IHVs to build drivers, so probably no DX10 level support on ATI cards from Apple any time soon either.

So lets face it, the entire industry is stalled on DX9 level features (except for niche markets).

The core of DX9 feature level (take out DX9 vertex texture fetch, because it was too slow on NVidia's 6 and 7 hardware, and not supported by ATi at the time) works rather well on GL right now! About the only really stupid thing on GL's part is not having support for pre-compiled shaders (which basically kills it for any game which has the material/shader combination explosion problem). But otherwise GL2 still currently kicks ass!

So now lets look at the short term future.

GL3 isn't going to be an important performance update for DX9 level programming. No new features, really just lower draw call overhead (hopefully), fine grain buffer locking, and better control of GPU<->CPU transfers (which we can already do well in GL2). We've got perhaps 3-6 years left on this console generation, stuck with DX9 level GPUs. So GL3+ and DX10+ makes no difference there. PC only games are becoming rarer, and Vista is not building market share fast enough, so GL3+/DX10+ isn't going to really make much of a difference there either for a while. Apple takes a while to catch up to any new GL stuff, so GL3 isn't going to be important on Mac any time soon either. You can currently use DX10 feature level on NVidia 8+9 series GPUs simply by using the GL2.1 EXTensions. So GL3 isn't important there either.

Now how about the future.

Right now there is a tremendous need to iron out a GL API which will properly transform !!!AND UNIFY!!! the industry past DX9 level features and into the highly programmable parallel GPU of the future.

How do you best do this?

Intel is getting serious about Larrabee (bought the Project Offset team + lots of really smart people like Tom Forsyth, etc). Larrabee looks to be highly programmable, and doesn't match well with the current GL pipeline. Its strengths seem to require a new type of API. NVidia, the current dominant vendor in market share for the high end, still has a bunch of features which would be highly useful for graphics but are not exposed in GL. For example, shader local memory store (exposed in CUDA). And perhaps a read/write surface cache is planned (programmable ROP perhaps?), as it is in the CUDA/PTX docs but not yet implemented in CUDA. ATI/AMD provides a 1/4-speed scatter capability (programmable ROP?) in CTM which is not exposed in GL.

So if you were the ARB what would you do? With Intel entering the market in the GL3 lifespan with a seriously divergent product how would you respond? How do you balance the strengths of all the vendors and create a core API to move forward? How do you mold GLSL into something which will work for all the vendors (especially Larrabee)?

So no news might actually be a good thing, in that perhaps the ARB is taking advantage of the current market conditions (DX9 being all that can really be used, and GL2 working fine there) and fixing up GL3 for the more generalized parallel GPU of the future?

Chris Lux
03-02-2008, 12:56 AM
I am only waiting for the news that GL3 is being released at SIGGRAPH 2008.

Thinking back, who did not think 'which year?' when they said September 30th last year. ;)

D3D for me is just something to look at in a side project, because portability over multiple operating systems and hardware platforms is much more important to me. In addition, D3D has nothing to offer for really professional use like hardware sync or quad-buffered stereo, so switching is never a real option for me; it is only something to learn from.

As mentioned earlier, when the current GL is used with a reduced, modern function set and current extensions it is, in my opinion, quite a clean interface and capable of everything D3D10 has to offer. By the reduced function set I mean restricting it to things like FBOs, VBOs and shader objects. Using the current GL this way, all I can hope for from GL3 is a much cleaner API for just this use (and everything points that way).

So I am happy with the current GL and can only to a certain point understand the bashing of it. Sure, the lack of the GL3 spec and of any information on its current state is most annoying, but until then I can work quite well with what I have.

Korval
03-02-2008, 02:11 AM
GL3 isn't going to be an important performance update for DX9-level programming.

Arrant nonsense. It is not hard to fall off the fast path in GL 2.1. Indeed, I imagine that most users of 2.1 can't even find the true fast path, as finding it requires lots of testing by the end user.

Making the fast path the only path is one of the primary goals of GL 3.0.


Given that ATI/AMD isn't adding EXT_gpu_program4-type support for the HD series anytime soon

How do you know that? I always assumed that the general feature freeze for GL 2.1 was to promote GL 3.0 and Mt. Evans when it came out. That is, ATi took its meager GL development and focused it on preparing for GL 3.0.

And don't forget: ATi just finished a complete rewrite of their GL drivers; that basically stalled any real GL development for a good 6-12 months. And that rewrite was probably looked at in terms of making sure that the low-level code was ready for GL 3.0.


Right now there is a tremendous need to iron out a GL API which will properly transform !!!AND UNIFY!!! the industry past DX9 level features and into the highly programmable parallel GPU of the future.

Um, I don't see anything particularly special about later-gen GPUs (even Larrabee) from a graphics API perspective. Oh, if you're talking about writing an API for accessing a GPU at a low level, then maybe there's a need for a unified API there. However, this is fundamentally different from a graphics API. The needs of the two tasks are different, with precious little overlap.

The absolute most that you could expect to see is some generalization of shaders, where shaders are fairly freeform. Inputs and outputs of shader "stages" being pipeable through various different stuff and so forth. However, this does nothing for any graphics hardware that actually exists today; that hardware is DX10.1 in terms of features, and thus can't take advantage of any of that high-end pipelining, yet is still trapped in the GL 2.1 land-of-horrible-API.

The goal of the GL 3.0 graphics API was to be backwards compatible with all GL 2.0-capable hardware (obviously not BC with the API). If you start taking that off the table, there's absolutely no point in switching to an API that can't be used until hardware that won't exist for 1.5 years comes out.

ruysch
03-02-2008, 04:52 AM
Personally, I don't. I mean, there are some nice things in D3D, but there are some abstractions that are simply unnecessary. Better to start from a direction where you look at D3D, but don't necessarily do things that D3D does simply because D3D does them.

As I mentioned, DirectX isn't the promised land. The trick is to learn the good parts of DirectX and perhaps ask why it is that DirectX has caught on so rapidly...


I think precompiled binary shaders are a good example of something which is practical and extremely useful in real applications.
The fact that GL hasn't adopted this indicates to me that the standard does not take end users seriously.
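
To make the wish concrete, here is a purely hypothetical sketch of what such a path could look like. The glGetProgramBinarySize, glGetProgramBinary and glProgramBinaryLoad names are invented (nothing of the kind exists in GL2); glLinkProgram and glGetProgramiv are the real GL 2.0 entry points:

#include <stdio.h>
#include <stdlib.h>
/* Assumes a GL 2.0 context and headers; the *Binary* calls are invented. */

void cacheProgramBinary(GLuint prog, FILE *cache)
{
    GLint ok = 0, size = 0;
    glLinkProgram(prog);                       /* the expensive compile/link */
    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    if (!ok) return;
    glGetProgramBinarySize(prog, &size);       /* hypothetical */
    void *blob = malloc(size);
    glGetProgramBinary(prog, size, blob);      /* hypothetical */
    fwrite(blob, 1, size, cache);              /* next run: read the blob and
                                                  glProgramBinaryLoad() it,
                                                  skipping the compile */
    free(blob);
}

With a material/shader combination explosion, being able to replay those cached blobs instead of recompiling thousands of GLSL strings is exactly what is being asked for here.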

Another lesson to learn from DirectX is rapid API evolution: the hardware evolves rapidly, and the standard should have seen even five years ago that the trend would be more and more programmability, and we are likely not done in that department yet. Having the ability to combine vertex, geometry and fragment shaders in a single effect is a nice feature; being able to set states inside the effect is very useful. Decoupling the sampler from the texture is a nice touch. Again, why can't we get simple, practical, useful stuff like that?

Why hasn't there been an effort to make an add-on library for OpenGL which takes some of the nicer things from D3DX? For such a library to be effective, some of the IHVs could perhaps back it. It could be a thin library which didn't depend on anything but what OpenGL exposes. Like D3DX, the library should not aim at replacing developers' internal tools, but still, having an SSE2- or SSE4-optimized math library (depending on your compile target) is nice to have...
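
For illustration only, here is the flavor of helper such a math library might export; a minimal SSE sketch in C (the function name is mine, not from any shipping library):

#include <xmmintrin.h>  /* SSE intrinsics; SSE2/SSE4 variants look similar */

/* Add two 4-component float vectors. Unaligned loads keep the example
   simple, though a real library would prefer 16-byte-aligned data. */
static void vec4_add(float dst[4], const float a[4], const float b[4])
{
    _mm_storeu_ps(dst, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
}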

But then again, the general feeling seems to be that OpenGL as a standard is on the right track. I only hope you are correct.
For me DirectX and OpenGL are just graphics APIs; I would like to have them both, but I fear the day when there is only one graphics API.

-NiCo-
03-02-2008, 05:35 AM
As I mentioned, DirectX isn't the promised land. The trick is to learn the good parts of DirectX and perhaps ask why it is that DirectX has caught on so rapidly...


Actually, you said



Well, DirectX is the promised land. I prefer DirectX, though that doesn't mean I exclusively use it; like most others in the graphics industry I started out with OpenGL. I just fail to see why the OpenGL standard time after time fails to anticipate the evolution of the industry.

:)



I think precompiled binary shaders are a good example of something which is practical and extremely useful in real applications.


Agreed.



Another lesson to learn from DirectX is rapid API evolution: the hardware evolves rapidly, and the standard should have seen even five years ago that the trend would be more and more programmability, and we are likely not done in that department yet.

Although rapid API evolution is a good thing, it can clutter the API specification. Of course, it shouldn't be too slow either *hint-to-ARB* :)



Having the ability to combine vertex, geometry and fragment shaders in a single effect is a nice feature; being able to set states inside the effect is very useful.


Agreed.


Why hasn't there been an effort to make an add-on library for OpenGL which takes some of the nicer things from D3DX? For such a library to be effective, some of the IHVs could perhaps back it. It could be a thin library which didn't depend on anything but what OpenGL exposes. Like D3DX, the library should not aim at replacing developers' internal tools, but still, having an SSE2- or SSE4-optimized math library (depending on your compile target) is nice to have...


Although a library like that would be a good thing, I'm fairly happy with the combination of libraries I use now (matrix libraries, image libraries, the boost library, ...). So maybe a list of related existing libraries would already be a good idea :)

Jan
03-02-2008, 05:35 AM
The philosophy of GL is to provide only the core API, not helper APIs (GLU is the only exception). Most of the things you mention are not core features in D3D either, but MS has the philosophy that they should provide as many helper tools as possible. That is their strength, but it is really only possible when you are limited to one platform and one genre. MS doesn't provide a multi-platform API, and they are focused on games, because that's what D3D is meant for.

OpenGL is meant for scientific simulations, CAD, etc., and some people actually use it for games too. The needs of all these groups are so vastly different that you cannot provide tools that would be useful for many people. So it makes more sense to let people create their tools themselves. To the few people who make games it looks like OpenGL has bad tool support, but that is only in comparison with D3D, which is focused on games for Windows.

When you actually create a professional game engine with D3D, you won't use most of the things that MS provides and will build your own tools anyway. So, in the end there is not a big difference, even with games. But simply looking at the two APIs and comparing their differences will give you a wrong impression, because OpenGL is not solely intended for games and thus has other design and support philosophies.

On the other hand, OpenGL is so old right now that, as mentioned above, it is only suitable for doing D3D9-level stuff and nothing beyond that. D3D had a rewrite every few years, and it was amazing how long it took until it actually reached OpenGL's quality. The problem is, today nobody wants to start writing a renderer with an API that is clearly dated and not future-proof anymore. If I needed to do a graphics project right now with the current features, I would use OpenGL 2.1, no problem. But what I want to do is write a renderer that I design and write today and use for the next 3-5 years, only extending it piece by piece.
To do that, I need a clean API. That is why I am impatiently waiting for OpenGL 3.0. My programs are currently all stalled on the graphics front, because I don't want to waste my time writing a 2.1 renderer (and REWRITING everything later).

Every D3D rewrite took a few years, with a team of people sitting in one location working on the new API all day. I wonder what superpowers the ARB people have that they thought they could do a complete rewrite in just a few months.

Jan.

Ysaneya
03-02-2008, 06:13 AM
The philosophy of gl is to only provide the core API, not helper APIs (glu is the only exception).

It makes me smile when arguments start to degenerate into the "philosophy" of OpenGL :) It's used a bit like a shield you raise each time you don't want to answer about an important piece of missing functionality. We refuse to hear anything about querying video memory usage, but meanwhile it's possible to query the number of temporary registers or max instructions in shaders, for example... oh, the irony.


OpenGL is meant for scientific simulations, CAD, etc. and some people actually use it for games too. The needs of all these groups are so vastly different, that you cannot provide tools that would be useful for many people.

We have an extension mechanism that works under many OSes. Why not an external library using the same idea?

In the worst case, we could release a "D3DX-like" library for Windows only and ignore users of other OSes. Yeah, that's not cool for them, but it's at least an improvement for the majority of users, and especially for beginners under Windows. It's better than saying "if we can't do it for all OSes, let's not do it at all", IMO. Remember that today's beginners might be tomorrow's professionals, so anything that can make OpenGL more accessible to them is good for OpenGL's future.

Y.

ruysch
03-02-2008, 06:35 AM
OpenGL is meant for scientific simulations ...

There has actually been quite a shift in favorite graphics API in the research community; take a look at a lot of the work being done at SIGGRAPH, for example. More and more well-known researchers have taken up DirectX. Which in itself might not be significant, but the problem is that they are likely to replace OpenGL-based courses at universities with DirectX as the primary platform.

There are a lot of serious signs that OpenGL's influence is seriously diminishing. But then again, a lot of the forum users seem to be very happy with how things are. So I will refrain from mingling in the debate. The debate has been here before, and back then the same religious speeches were given: Microsoft is evil, OpenGL is multi-platform (is it really? I mean, can I use OpenGL on any of the consoles... or should I use a Sony-approved OpenGL ES?)
Damn, this debate is seriously in the religious lane!

MeneerDePeer
03-02-2008, 07:42 AM
Not to be rude or anything, but aren't most of us in the gaming industry? The people I know in the movie industry hardly use graphics hardware at all; mostly they use software-based renderers.

Well, I'm certainly not in the gaming industry, and there's a whole lot to do before you can actually start using that software-based renderer to spit out sexy renderings.

Think about all the modellers and sculptors who use graphics hardware to the bone to display their high-resolution meshes, or digital painters, or animators who create storyboard/previz material first using the preview frames in their viewports and then slowly replace them with the final software-based renderings.

Without graphics hardware, artists would not be able to make all those great looking end results.

pudman
03-02-2008, 09:48 AM
Microsoft is evil, OpenGL is multi-platform (is it really? I mean, can I use OpenGL on any of the consoles... or should I use a Sony-approved OpenGL ES?)

First, no one has been saying that MS is evil, only that they have lots of resources. And it is in their interest to use those resources for their own platform.

Second, when people talk about multi-platform they don't mean just Windows and consoles. They mean Windows, Linux, Mac, Solaris, SGI, BSD, mobile devices and, yes, also consoles. In the professional world these other platforms are very significant. I personally use Solaris daily.

Do you even know what APIs are available for the PS3? There's libgcm, Sony's proprietary API written for their hardware. There's also PSGL (PlayStation OpenGL, written on top of libgcm) that they added for more portability. ( Reference (http://www.inalogic.com/post/choosing-your-graphics-api-for-the-ps3/) )

Why didn't Sony write a D3D layer? Why don't mobile devices use a D3D "ES" API?

If Khronos puts enough effort into GL3 that it becomes as durable as GL1.x-GL2.x, then that would be a really Good Thing. Unlike DX, there wouldn't be a need to rewrite it every few years.

ruysch
03-02-2008, 10:38 AM
For mobile devices you are to use OpenGL ES... You can't find a mobile device which actually supports, say, OpenGL 2.1. Sure, you can use ES, but that's an entirely different API.

My point on consoles is that there is no standard but what the vendors create. Sony has theirs, Nintendo has theirs and Microsoft has theirs... so again, I can't see which console people are referring to when they say that OpenGL is multi-platform.

And do not give me that years-old record about multiple OSes, because again and again that fails in practice.

Again you fail to understand that DirectX was not developed by Microsoft alone. Like HLSL and Cg, it was a joint effort between the IHVs: AMD/ATI, NVIDIA and Intel. They are core members of Khronos, and they wanted DirectX first! I actually fail to see the successes of Khronos... because clearly the core members are devoted not to OpenGL but to DirectX.

Lindley
03-02-2008, 10:54 AM
First, no one has been saying that MS is evil

Yeah, we were just giving each other significant glances and laughing incessantly.

V-man
03-02-2008, 11:24 AM
My point on consoles is that there is no standard but what the vendors create. Sony has theirs, Nintendo has theirs and Microsoft has theirs... so again, I can't see which console people are referring to when they say that OpenGL is multi-platform.

That's nice.

The question is : what do you want from GL or Khronos?
1. You want it to be multiplatform?
Anyone can implement it on their platform. It is not Khronos's fault if it is not available on XBox.
There is nothing to improve in this regard.

2. D3D is being used by university researchers.
It is not GL's fault. This is just an end user's choice.

3. You want an SDK, utilities, etc.
Forget about it. There are various libs out there and SDKs from NV and ATI. Perhaps in your eyes they will never be as good as what MS offers.

And finally, you mention that Linux and OSX are not significant.
Thus, you are a Windows-only guy. Why the heck would anyone not use DX in that case?
Maybe you simply like the GL API?

ruysch
03-02-2008, 12:29 PM
Maybe you simply like the GL API?


Yes, there are many parts of the OpenGL API that I not only like but love. I like the clean way you can handle the most basic stuff, and I like the framebuffer object extension.

There are parts of GLSL I like. I personally hope they will allow binary shaders and state handling inside a common block, and that they will have a precompiler ready for 3.0.

What I mean by multi-platform is that in the end it's up to the vendors whether they are willing to commit driver developers to your particular platform (e.g. see how ATI has thrown in the towel and open-sourced their driver development for Linux), or look at Apple's lack of progress in implementing extensions which the hardware is actually capable of. Again and again OpenGL is at the mercy of the IHVs.

Clearly both NVIDIA and ATI have SDKs, but then again it has also become apparent that both companies are devoting their time to DirectX. My point is that OpenGL lacks a company devoted to the survival of the standard. Khronos isn't enough, because the major players in Khronos are precisely NVIDIA and ATI, and they do not seem to focus on OpenGL.

-NiCo-
03-02-2008, 12:38 PM
Khronos isn't enough, because the major players in Khronos are precisely NVIDIA and ATI, and they do not seem to focus on OpenGL.

I don't agree (for Nvidia anyway). Their Cg 2.0 Toolkit/SDK already supports the latest features (gp4vp, gp4gp, gp4fp) in OpenGL, while the Cg Direct3D interface is stuck at DirectX 9c. The same is true for CUDA interoperability. And they still provide nice OpenGL SDKs, although they might not be as elaborate as the Direct3D ones.

ruysch
03-02-2008, 12:53 PM
Now, the trick with Cg and HLSL is that NVIDIA and Microsoft joined forces in developing the first shader languages for graphics hardware; both companies are in the language business ;-) But you are right, NVIDIA is still trying to promote Cg as the standard shader language.

And NVIDIA can hardly compete with HLSL as the native language when using DirectX; on the other hand, they can more than compete with GLSL. Which also explains the occasional driver flukes, because they certainly have an interest in bringing Cg into the OpenGL standard.



But let's bury the DirectX comparison.

Has anyone heard anything at all about when we are to expect OpenGL 3.0? What's causing the delay?

Timothy Farrar
03-02-2008, 12:54 PM
Arrant nonsense. It is not hard to fall off the fast path in GL 2.1. Indeed, I imagine that most users of 2.1 can't even find the true fast path, as finding it requires lots of testing by the end user.

The fast path is always going to be hardware-specific and something which requires either good vendor documentation or simple profiling by the programmer. How is GL3 going to aid the programmer in knowing? Is GL3 going to have an API which you can probe to learn all the differences in hardware? That would simply be a nightmare of complexity.

The core stuff needed for speed is currently well documented for GL2. How much faster do you actually think a GL3 app will be compared to a GL2 app on the same hardware?

About the only case where I can see GL3 kicking some arse is if your application is draw-call bound in the driver (actually on the driver side, and not on your side feeding the draw calls). And if you are driver-side draw-call bound, then you might want to re-engineer, optimize, and factor the draw calls away anyway.


Making the fast path the only path is one of the primary goals of GL 3.0.

Agreed. And GL3 hopefully will finally enable some IHVs to actually get fully functional and DX matching drivers out when they release hardware.



Given that ATI/AMD isn't adding EXT_gpu_program4-type support for the HD series anytime soon

How do you know that? I always assumed that the general feature freeze for GL 2.1 was to promote GL 3.0 and Mt. Evans when it came out. That is, ATi took its meager GL development and focused it on preparing for GL 3.0.

And don't forget: ATi just finished a complete rewrite of their GL drivers; that basically stalled any real GL development for a good 6-12 months. And that rewrite was probably looked at in terms of making sure that the low-level code was ready for GL 3.0.

I hope you are right about that!

But that being said, all the hooks needed for GL2.1 and most of EXT_gpu_program4 should already be in the DX10 / DX10.1 drivers. It wouldn't be too tough to build a mostly functioning GL2.1 + EXT_gpu_program4 wrapper around the DX10 interface. Take NVidia's drivers, for example: aren't they something like a 95% shared code base across all platforms (DX and GL on multiple OSes)? Why blow 6-12 months writing a separate, not fully functional new GL driver when you could just do a fully functional wrapper around a shared, already fully working DX10 code base?

Even with GL3 done, I'm guessing it is going to be a long time before Mt. Evans, and I would still bet ATI/AMD doesn't add support for GL2.1 EXT_gpu_program4 in the meantime. Honestly, I would love, really love, to be wrong on this.


Um, I don't see anything particularly special about later-gen GPUs (even Larrabee) from a graphics API perspective. Oh, if you're talking about writing an API for accessing a GPU at a low level, then maybe there's a need for a unified API there. However, this is fundamentally different from a graphics API. The needs of the two tasks are different, with precious little overlap.

Not yet; just wait, there will be much overlap in the future. I can give many, many examples if need be.


The goal of the GL 3.0 graphics API was to be backwards compatible with all GL 2.0-capable hardware (obviously not BC with the API). If you start taking that off the table, there's absolutely no point in switching to an API that can't be used until hardware that won't exist for 1.5 years comes out.

Very important point here. Could part of the problem be that GL3 was also trying to be "future proof", and the future looks quite different from the older hardware?

pudman
03-02-2008, 01:16 PM
For mobile devices you are to use OpenGL ES... You can't find a mobile device which actually supports, say, OpenGL 2.1. Sure, you can use ES, but that's an entirely different API.

I believe "entire" is quite an overstatement. OpenGL ES *is* OpenGL but stripped down to bare necessities implementable on current mobile devices.


My point on consoles is that there is no standard but what the vendors create.

I believe that's the antithesis of "standard". I didn't interpret that as "your point". You stated that OpenGL was not really multi-platform because it wasn't supported on all consoles. Even for gaming, GL is more multi-platform than D3D... PC, OSX, Linux, ~PS3. If you include professional applications it's an even wider market. That was *my* point.

It remains to be seen how much support there will be behind GL3 when the spec is finally out. Right now it's pure conjecture.


Again you fail to understand that DirectX was not developed by Microsoft alone.

I understand that completely, don't worry. Without MS there would be no D3D. While SGI did create GL, "Open" GL exists just fine without them. However, the ARB and now Khronos don't have the resources that MS does to throw at tools and developer promotion.


My point is that OpenGL lacks a company devoted to the survival of the standard.

You are absolutely correct. Having a well-known company (or arguably companies) "dependent" on GL would be a huge benefit. Dependency is a great motivator.

ruysch
03-02-2008, 01:31 PM
You are absolutely correct. Having a well-known company (or arguably companies) "dependent" on GL would be a huge benefit. Dependency is a great motivator.

Then we agree ;-) The lack of a major player involved with/devoted to OpenGL is becoming more and more apparent, at least to me.



Even for gaming, GL is more multi-platform than D3D... PC, OSX, Linux, ~PS3


OpenGL running on the PS3 is just not correct; it's a Sony-owned version of OpenGL ES. And with the lack of good drivers you can count out OSX and Linux, depending on which IHV you are trusting not to throw a null pointer at your attempt to unlock geometry shader 4.0 ;-) Add consistent quirks, like ATI drivers dropping to software when writing to gl_ClipVertex; I am simply amazed at the lack of driver conformance, and that the IHVs get away with such sloppiness.

By the way, OpenGL ES isn't the only player in the mobile device market; NVIDIA has actually decided that the new APX 2500 is targeted at Windows Mobile only. It will have support for OpenGL, but it is likely that the preferred graphics API will be D3D Mobile. Time will tell.

It's a standard, so in theory your list is correct, but not in real life. Take for example texture proxies, which for as far back as I remember have not worked.
http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=234665#Post234665

Korval
03-02-2008, 02:57 PM
Why hasn't there been an effort to make an add-on library for OpenGL which takes some of the nicer things from D3DX?

And who precisely is going to write and support it? You?

The ARB pre-Khronos had precisely enough resources to write specifications and PDFs. The Khronos ARB has slightly more resources, but look at the "OpenGL SDK". Everyone knows what an SDK is supposed to be, yet they released nothing of the kind. Why? Because they don't have the resources to write, debug, and maintain anything like D3DX.


It makes me smile when arguments start to degenerate into the "philosophy" of OpenGL :) It's used a bit like a shield you raise each time you don't want to answer about an important piece of missing functionality.

It's not a shield, it's a prioritization. It represents an understanding that a particular process of design has a goal to it, and the achievement of that goal is to be served by that process. So when you look at feature X, if it doesn't fit with the design goals, you don't include it.


We refuse to hear anything about querying video memory usage, but meanwhile it's possible to query the number of temporary registers or max instructions in shaders, for example...

Actually, max instructions and temporary register counts aren't queryable in glslang, only in the ARB_*_program extensions.

Oh, and what happens to your precious "video memory query" when full-on virtualization comes along? Should the driver report that the hardware only has 64MB of memory, or the size of your main memory? That "philosophy" exists for a reason.


But then again, a lot of the forum users seem to be very happy with how things are.

Um, since when? Nobody on this forum is "very happy" with GL 2.1 as it stands. We all want GL 3.0. What we don't necessarily want is GL 3.0 == D3D, which is what you seem to be arguing for.


The fast path is always going to be hardware-specific and something which requires either good vendor documentation or simple profiling by the programmer.

That's a half-truth.

Yes, the fast path will always be hardware-specific. But the API does nothing to communicate hardware limitations to the user. Want to use normalized unsigned shorts in your vertex data? The API says fine, whether the hardware can do it or not. And if it can't, then the driver must read your VBO (slow), rearrange your data, and write it to another VBO that it then renders from. And it does this every time you draw.

That's one of the things GL 3.0 promises to deal with. When you create a VAO, you tell it what your vertex format will look like. If the hardware doesn't like it, it will simply fail to create the VAO. Simple.

You may not be using the fastest possible path for your data (using floats when you could live with unsigned shorts or something), but you won't fall into the "driver must emulate functionality X for this hardware" path. There is better communication between the user and the hardware through the API.
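
In hypothetical code (every lp* name below is invented, since the real GL3 entry points are unpublished), the difference is simply where the failure shows up:

/* GL3-style: creation itself is the capability test. All names hypothetical. */
LPvertexFormat chooseVertexFormat(const LPattrib *shortAttribs,
                                  const LPattrib *floatAttribs, int count)
{
    LPvertexFormat fmt = lpCreateVertexFormat(shortAttribs, count);
    if (fmt == NULL) {
        /* Hardware can't source normalized shorts directly; fall back to
           plain floats ourselves instead of paying the hidden per-draw
           conversion a GL 2.1 driver would perform silently. */
        fmt = lpCreateVertexFormat(floatAttribs, count);
    }
    return fmt;
}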


It wouldn't be too tough to build a mostly functioning GL2.1 + EXT_gpu_program4 wrapper around the DX10 interface.

You might have noticed that OpenGL uses a different shading language from D3D. That means that implementing EXT_gpu_program4 would require changing the compiler, linker, and optimizer. This is a lot of work.

It isn't a lot of work for nVidia, because they just compile glslang to Cg, which by being a slight spinoff of D3D-HLSL, already has these features. They can leverage their D3D-HLSL compiler. ATi can't.


The lack of a major player involved with/devoted to OpenGL is becoming more and more apparent, at least to me.

Have we forgotten Apple? Oh, and therefore Blizzard.

Blizzard, for reasons that continue to perplex me, loves making Mac games. Every Blizzard game has a Mac version, and they will continue to do so.

Apple has precisely one interface for 3D graphics: OpenGL. That's it. If OpenGL doesn't work, World of Warcraft doesn't work. And that's not acceptable to Blizzard or Apple.

OpenGL is built into MacOSX itself; the OS relies upon it for its rendering needs (though it is a bit more complex than that).

In short, Apple needs OpenGL in order to have 3D applications. Blizzard needs OpenGL to work in order to make 3D games on the Mac.

Apple doesn't care about getting an OpenGL SDK; that's not important to them in any way, because it doesn't help solve their problems. What Apple wants is an OpenGL that is more performant, easier to write drivers for (though they solved much of this problem in their OS implementation), and so on. Apple is dependent on OpenGL.

The problem isn't that no one is dependent on it; the problem is that no one can dictate the progress of OpenGL. Microsoft may have collaborated with nVidia on D3D-HLSL, but they had final say. The language was as they wanted it to be. The D3D API may have been developed with input from IHVs, but Microsoft was ultimately in charge of it.

The fact that everything requires a vote and so forth with the ARB is probably the biggest limiting factor. If two people are having a good faith disagreement on a point, the deadlock can't easily be broken.


Take for example texture proxies, which for as far back as I remember have not worked.

I think you're misunderstanding people. It isn't that everyone thinks everything's A-OK with OpenGL. It's that they don't think the solution is to give up or to copy D3D.

pudman
03-02-2008, 02:59 PM
You are absolutely correct. Having a well-known company (or arguably companies) "dependent" on GL would be a huge benefit. Dependency is a great motivator.

Then we agree ;-) The lack of a major player involved with/devoted to OpenGL is becoming more and more apparent, at least to me.

Some would argue that OpenGL would have had no place in gaming had it not been for John Carmack's vocal support for it back around '96. It would definitely be a boost for GL for something like that to happen now. Who knows, maybe when 3.0 comes out there will be a surge of support.


OpenGL running on the PS3 is just not correct; it's a Sony-owned version of OpenGL ES.

Every GL implementation is a proprietary implementation, technically.


And with the lack of good drivers you can count out OSX and Linux...

You are really just splitting hairs. If GL didn't work on any platform, sure, it wouldn't be considered multi-platform. The facts are against you, however. World of Warcraft runs quite fine on OSX, for example. And you'd be hard-pressed to tell the coders in this forum who work on Linux that they aren't technically using OpenGL because the drivers aren't as good as Windows drivers.

Lack of perfect support does not make GL less multi-platform. You could argue its usefulness on certain platforms, but the fact remains that if you do 3D on a platform other than an MS platform, chances are you're using GL. I'm not sure how else you'd define multi-platform support.

Jan
03-02-2008, 03:00 PM
I have just written an app for Linux, but since I am a Windows guy I did it on Windows (with Qt). "Porting" it to Linux was really only compiling it for Linux. There were zero issues with the OpenGL drivers, neither on nVidia nor on ATI, that I didn't also have on Windows.

So, in MY opinion, that counts as multi-platform. With D3D that wouldn't have been possible.

But in MY opinion you fail to see those arguments anyway. YOU argue that for an API to be multi-platform it needs to
a) be supported on ANY platform
b) work without ANY differences, equally on ALL platforms

a) is not the decision of the ARB. The owners of the platforms decide whether they want to support OpenGL; therefore there is no OpenGL for the Xbox.
b) is not possible, since every implementation is done by a different vendor, introducing different errors and having different performance characteristics for their hardware. But D3D is no different on this point.

But I guess this discussion is pointless anyway.
Jan.

EvilOne
03-03-2008, 07:25 AM
Hmmm, what's planned for GL3 doesn't make me really happy. My problem here is the "error reporting on create". I'd prefer a "query what works" before creating any resource and a simple "fail on create" when creating the resource. I see an explosion in error return codes...





// TODO: Make this a recursive function... lol
GLerror error = glCreateStuff();
if (error != GL_NO_ERROR)
{
    AdjustParameters(error);
    error = glCreateStuff();
    if (error != GL_NO_ERROR)
    {
        AdjustParameters(error);
        error = glCreateStuff();
        if (error != GL_NO_ERROR)
        {
            AdjustParameters(error);
            error = glCreateStuff();
            if (error != GL_NO_ERROR)
            {
                // You get the point... hehehe :-D
            }
        }
    }
}


It's true, not really a very important point, but somehow annoying. And it makes writing buffer factory classes impossible.

EvilOne
03-03-2008, 07:59 AM
As a funny side note, this board is proof that GL2.0 + FBO is in no way a replacement for DX9. 90% of the threads are about driver errors, unpredictable behaviour, non-standard behaviour, strange results without error codes, incompatibilities, crashes, etc.

I hereby demand a name change from "OpenGL Coding: Advanced" to "OpenGL Coding: Guess what the error is! A forum for the whole family!"

My sense of humor is a bit crappy today.

Truth be told, it would be really nice if we could get some progress update that is not the usual "we are working on something, look at our nice presentations"... but simply: why the delay, what's the current state, what are we working on?

knackered
03-03-2008, 10:53 AM
And it makes writing buffer factory classes impossible.
How so? If a format's not supported, then it's not supported. Makes no difference whether you called glCreateStuff() or glIsThisPossible().
You're being a picky sod, and frankly it's getting annoying.

EvilOne
03-03-2008, 12:19 PM
Yeah, that's the problem with public forums. Simply click on my user name, click on "view profile"; a user page shows up. Click "Ignore this user". Your problem is solved. Wow... that gives some ARB feeling. "Ignore those users". Why not shove your opinion about other people right up your ass?

So this implies to me that we can't even query texture sizes before trying to create the resource? And then we start permuting widths and heights? Or keep the half-baked model we have currently, where some stuff we can query and some we can't? Then I propose the following model: all texture widths are available by query, all heights by create-and-error. Same with vertex element types: unsigned types by query, signed types by create-and-error. Floating-point texture formats by query, integer types... etc.

ruysch
03-03-2008, 12:48 PM
Well, the query mechanism as I have heard it described sounds a lot like the way D3D works. Could you post a link to where you got your information?

EvilOne
03-03-2008, 01:00 PM
That's what I've read in the few Pipeline newsletters and some posts on the boards.

knackered
03-03-2008, 02:05 PM
Yeah, that's the problem with public forums. Simply click on my user name, click on "view profile"; a user page shows up. Click "Ignore this user". Your problem is solved. Wow... that gives some ARB feeling. "Ignore those users". Why not shove your opinion about other people right up your ass?
Are you feeling ok?

Lindley
03-03-2008, 02:07 PM
If what you want to do won't work, you can't do it. It's really that simple. Query mechanisms won't help a whole lot to overcome this, at least as far as I can see. *IF* you were trying to find some working combination, you'd have to enumerate all possibilities until something valid popped through either way. It's not like it's a linear programming problem.....

Unless I'm missing something.

Stephen A
03-03-2008, 03:25 PM
Lindley, imagine the following hypothetical scenario: you have a 4096x4096 bitmap that you want to upload as a texture. Depending on max texture size, you'll need to downsample it, or split it up into different objects. How would you do this with and without queries?

A trivial example, but it highlights the difference in the approaches. If you can query max texture size:


int max = GL.GetInteger(GetPName.MaxTextureSize);
if (max < bmp.Size)
    bmp.Resize(max, max);                       // downsample to fit
GL.TexImage2D(texture, PixelFormat.Bgra, PixelInternalFormat.Four,
              bmp.Width, bmp.Height, bmp.RawData);
if ((error = GL.GetError()) != 0)               // assign, then compare
    ...


If you cannot:


do
{
    GL.TexImage2D(texture, PixelFormat.Bgra, PixelInternalFormat.Four,
                  bmp.Width, bmp.Height, bmp.RawData);
    if ((error = GL.GetError()) != 0)           // assign, then compare
    {
        if (error == ErrorCode.TextureSizeExceeded)
            bmp.Resize(bmp.Width >> 1, bmp.Height >> 1);  // blind halving
        else
            ...
    }
}
while (error != 0);

The latter is less performant, as you have to blindly resize the bitmap until it fits; and since you can't query maximum size beforehand, there is no way around that.

It is not linear programming either way. However, queries allow you to separate the setup from the render path (i.e. check capabilities and choose the best available render path). Without queries, you have little choice but to litter the actual render path with blind trials. You may argue that you can execute the trials beforehand, but in that case you are just emulating queries, albeit less efficiently!

Anyway, this topic has been discussed extensively in the other GL3 thread. Back to waiting, I guess.

Jan
03-03-2008, 03:57 PM
I would say that max texture sizes are parameters that will be queryable. HOWEVER, even if you do fulfill the restriction on the size, creation might still fail (as you recognized in your first example, too).

It makes no sense to have an API that does not allow querying ANY values, as your example shows well.

However, in OpenGL 2.1 the spec implies that when you fulfill the queryable restrictions, it will work (which in reality is not the case). OpenGL 3.0 simply changes this: you will most certainly still be able to query some max values, but the meaning of those is simply that when you DO NOT fulfill the restrictions, it will DEFINITELY fail. When you DO fulfill the restrictions, it is still up to the creation call to decide whether the actual combination is possible.

This is an important contrast to 2.1, where, once you have made sure to fulfill the restrictions, there is no mechanism for the driver to reject your chosen combination (only FBOs are able to return "unsupported"). So the drivers TRY to allow all valid configurations, which often makes it difficult to hit the fast path.
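
For comparison, GL 2.1's one general "ask the driver first" mechanism is the proxy texture: you issue the creation call against GL_PROXY_TEXTURE_2D with no data, then read back whether the parameters stuck. Roughly (these are real GL 2.1 calls):

GLboolean texture4096Supported(void)
{
    GLint width = 0;
    /* NULL data: nothing is allocated, the driver only evaluates the
       size/format combination. */
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGBA8, 4096, 4096, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
    return width != 0;   /* 0 means the driver rejected the combination */
}

And, as reported earlier in the thread, proxies are exactly the part of 2.1 that drivers often get wrong: the query and the real creation are allowed to disagree, which is the point being made above.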

There is one ring to rule them all, but there is not one mechanism to fulfill everybody's needs. The ARB might be slow, but they are not stupid. There will always be min/max values to query, but with 3.0 the driver will have a way to communicate with the application, instead of silently trying to obey its master for better or worse.

Jan.

Brolingstanz
03-03-2008, 04:20 PM
Truth be told, it would be really nice if we could get some progress update that is not the usual "we are working on something, look at our nice presentations"... but simply: why the delay, what's the current state, what are we working on?


The thing is they can't just lean in here and whisper in our ears... they'd have to announce it to the world, and in so doing they'd have to make it official, with all the pomp and circumstance befitting an announcement from Khronos. As such I suspect they'd like to wait until they have something substantial to say, in preference to a spring throat clearing.

Anyhoo, it wouldn't be a pub-like nudge-nudge, wink-wink, say no more, he said knowingly kind of thing.

Zengar
03-03-2008, 04:43 PM
About the query vs. error-on-creation issue:

When you develop an application, you have a general idea of what the hardware you target is capable of. So you can have different rendering paths, ending with code like:

[CODE]
if not VeryAdvancedPath then
    if not AdvancedPath then
        if not SimplePath then Exception("Get a new card!")
[/CODE]

In my opinion, this is very convenient. With a query mechanism, you would need to "duplicate" your code (a query part and a setup part), making it more complicated. The idea is that a single application will never use a lot of different feature combinations. It is common sense that each card that, say, supports floating-point texture blending also supports the so-called "SM2.0" shaders. The point is, the combinations of features are not arbitrary, but rather very well defined by the evolution of graphics cards. So the "trial-and-error" structure of GL3 is very practical. Also, there are features that can hardly be queried, like special combinations of shader instructions.

Hampel
03-04-2008, 01:42 AM
But I think there should be a very simple query mechanism: one that returns true iff, for a given configuration, the object creation will succeed. That is, for every
lpCreate<Object>(arg0, arg1, ...) function there should be an
lpIsValidConfigFor<Object>(arg0, arg1, ...) function which checks only the validity but does not create the object.
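
In code, the proposal amounts to something like this (lp* names are hypothetical, following the naming scheme above, and the UI helper is assumed):

/* Validate a configuration without creating anything, e.g. to
   populate a settings dialog. */
if (lpIsValidConfigForImage(format, width, height))
    addResolutionEntry(width, height);          /* assumed UI helper */

/* Later, when the user commits, create it for real: */
LPimage img = lpCreateImage(format, width, height);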

tanzanite
03-04-2008, 05:43 AM
That is, for every function there should be a function which checks only the validity but does not create the object.

No, I think not. Valid != "can do it whenever you need it".

Counterexample of events:
=> ask if A obj can be created.
<= yes, there is enough (address) space (or whatever finite resource).
** some OTHER app uses or clutters that space somehow.
=> make obj A
<= ... oh, [censored] - I lied, can't do it.

As I see it, the GL3 plan is sound. No?

knackered
03-04-2008, 11:08 AM
The latter is less performant, as you have to blindly resize the bitmap until it fits; and since you can't query maximum size beforehand, there is no way around that.
Well no, that's not true. GL3 separates out the parameters from the content of an object, so you'll just be trying to create a container object capable of what you require rather than submitting and re-submitting data as part of the creation process. So your simple example is not proving any point at all.

Korval
03-04-2008, 11:12 AM
The latter is less performant, as you have to blindly resize the bitmap until it fits

Technically, that's not true. For images.

GL 3.0, at the last info dump, had ImageFormat objects which, as part of their parameters, took a range of image resolutions. You can't create a texture (technically an Image) without an ImageFormat, and it is illegal to specify resolutions for that Image that are outside the range you asked for with the ImageFormat.

So rather than sitting and spinning on every texture create to find the largest resolution, you would only need to sit and spin on ImageFormat creation. And ImageFormats are intended to be widely shared.


But I think there should be a very simple query mechanism: one that returns true iff, for a given configuration, the object creation will succeed.

What would be the point?

I mean, you went through the trouble of setting up a Template object with all of the parameters. The fact that you want to query whether this object can be created means that you will want to create it, probably sometime in the near future. Would it kill you to just create it now and hang on to it?

Oh, and let's not forget the ImageFormat stuff.

Stephen A
03-04-2008, 01:58 PM
So rather than sitting and spinning on every texture create to find the largest resolution, you would only need to sit and spin on ImageFormat creation.
So technically, an ImageFormat can be thought of as a kind of "formalized" query. Feels like a sound approach.

Does this come from a newsletter? If not, do you have a link? I'd like to see what guarantees an ImageFormat would convey. Ideally, if you have an ImageFormat, texture creation would never fail unless there is an out-of-memory error (virtualized memory to the rescue?) or the user hot-plugs the video card. Also, do we know if this approach will be followed for all OpenGL resources where it makes sense (e.g. vertex formats)?

Hopefully GL3 will be able to avoid the mix of approaches in GL2, where different resources can be queried (texture size), tested with proxy objects (texture formats and size), can return unsupported errors (FBO), or even fail without any way you could test beforehand (GLSL, plug and pray). :)

On another note, I silently admire Khronos. I didn't follow FBO or GL2 development back then, but I find it rather impressive that there have been almost no leaks for 6 months now. Well done guys!

Zengar
03-04-2008, 02:19 PM
There were never leaks )) Very few insiders, and too few people interested.

Korval
03-04-2008, 04:00 PM
Does this come from a newsletter?

Essentially, yes. The stuff about what a Format does was covered in a newsletter (specifically, here (http://www.opengl.org/pipeline/article/vol004_4/)). I specifically asked one of the ARB guys whether the creation of a Format (or other object) that couldn't be hardware accelerated would fail, and the answer was yes. So if you ask for a FLOAT_32 format and you want it to be bilinearly filtered, and the hardware can't accelerate that, the creation of the format will fail.

It was in the SIGGRAPH info where they specified that Format objects would contain usage information. Not hints (which are ignorable), but usage info. That is, if you create a Format and you don't say that it needs to be bilinearly filterable, it is an error to use an Image created from that format as a texture.

Indeed, you can't use an Image created from that Format as a texture (ie: bind it to a glslang sampler) unless you actually say that you want to be able to. That way, the concept of "RenderBuffer" goes away entirely; it's simply a matter of the Format that you use. If you want images created from a Format to be bound as both Textures and Render targets, you can, but you have to say so in the Format.
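
Pulling those pieces together as a hypothetical sketch (every name below is a guess at the unpublished API, in the template/attribute style the newsletter described):

/* All names hypothetical. The Format carries both the capability
   request and the usage declaration, so failure happens here, once,
   instead of on every texture create. */
LPtemplate t = lpCreateTemplate(LP_IMAGE_FORMAT);
lpTemplateAttrib(t, LP_FORMAT, LP_FLOAT32_RGBA);
lpTemplateAttrib(t, LP_MAX_DIMENSIONS, 4096);
lpTemplateAttrib(t, LP_USAGE, LP_TEXTURE | LP_RENDER_TARGET);
lpTemplateAttrib(t, LP_FILTER, LP_LINEAR);
LPformat fmt = lpCreateImageFormat(t);
if (fmt == NULL) {
    /* e.g. FLOAT32 plus linear filtering isn't accelerated here. */
}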


Also, do we know if this approach will be followed for all OpenGL resources were it makes sense?

The Vertex Array Object structure is similar to Formats, but different. The VAO contains all of the binding points for the vertex attributes for a single draw call. So it can verify that the data type, size of data, stride, etc. are valid for a complete set of array data. The only things that vary (i.e., are not checked at creation time) are the buffer objects themselves and the offsets into those buffers.

Program objects take care of themselves (either the program works or it doesn't build an object). FrameBuffer objects have as a requirement for creation the Format for each attachment point, so they can verify at creation time whether Images created from those Formats can be attached.

You should read up on the SIGGRAPH information. It's all pretty sweet. Which makes the fact that it's been 6 months without new information since then even more infuriating...

Hampel
03-05-2008, 02:20 AM
But I think there should be a very simple query mechanism: one that returns true iff, for a given configuration, the object creation will succeed.

What would be the point?

I mean, you went through the trouble of setting up a Template object with all of the parameters. The fact that you want to query whether this object can be created means that you will want to create it, probably sometime in the near future. Would it kill you to just create it now and hang on to it?

Think of some combo boxes for end-users for tweaking render parameters, for example, which should only contain valid values or valid combinations of values; no need to create the corresponding GL objects for that!

Korval
03-05-2008, 11:27 AM
Think of some combo boxes for end-users for tweaking render parameters, for example, which should only contain valid values or valid combinations of values; no need to create the corresponding GL objects for that!

Then delete them afterwards. That isn't performance-critical code.

Stephen A
03-05-2008, 04:15 PM
Thanks for the detailed explanation, Korval. I'm going through the available material now, and it's pretty sweet indeed.

Let's hope for a 2008 release...

k_szczech
03-06-2008, 11:26 AM
I turn my eyes away for a few days and whoooa!
This topic continues to grow, covering more and more areas. We've already been through all our standard off-topic topics: Microsoft, D3D vs OpenGL, vertex arrays, the new object model in the upcoming GL3 (if you look at the date of the newsletter where it was presented, it's not that new after all)...

What can I say? Maybe I'll just stick to the original topic:

"Are we there yet?" :P

And until we're there, don't expect me to have anything more to say...

knackered
03-06-2008, 12:11 PM
I think this delay is taking the p*ss now. We seriously want some feedback on the status of GL3.

Mark Shaxted
03-06-2008, 12:49 PM
You'd have thought common courtesy would dictate that 'someone' would make some kind of announcement, even if it's simply "Hello, we're still progressing, don't worry!". A "Hello, OpenGL World" kinda thing :)

Actually, the lack of any feedback whatsoever is verging on downright rude, in my book.

zed
03-06-2008, 01:32 PM
Actually, the lack of any feedback whatsoever is verging on downright rude, in my book.
True, but they might be under an NDA.
Some NDAs don't allow you even to say that you're under NDA :)

The longer it goes on, the more it looks like a legal issue and not an implementation issue.

Mark, I've refined the coords, so with the help of Google Earth you should know which house to drop the bomb on.

Mark Shaxted
03-06-2008, 02:14 PM
Q to zed...

I was bored, and checked your location... Do you live in a field by a racecourse?

Zak McKrakem
03-06-2008, 05:06 PM
From 1998 (AFAIK) until 2006 there was a Direct3D tutorial and an OpenGL tutorial at GDC (one day long, each). Last year there was no OpenGL tutorial, but there was a Khronos Group booth at GDC's Expo (not very well planned, in my opinion) where, supposedly, there was one day a talk about GL3 (it was a noisy place in the Expo). And there were a few talks about OpenGL in the conference.
This year there was no OpenGL tutorial (the Direct3D one was still there), there was no Khronos booth, and there was no OpenGL talk.

So, if there is no OpenGL at the main game developer conference, I don't see the 'promotional' advantages of the Khronos Group. The only thing I see is that there was something, and now there is nothing.

Also, the IHVs sold us, the developers, on the advantages of the change from the ARB to Khronos: being informed, the OpenGL newsletter, more feedback, etc. And, after the initial 'explosion', there is nothing.

Even the main IHVs have decreased their OpenGL developer support in recent years: compare the number of OpenGL examples vs. D3D examples in NVIDIA SDK 8.5 with the same comparison in SDK 10.0, and note how the tools they have made in those years are just D3D tools.

OpenGL is becoming some kind of religion. You have to believe without seeing.

knackered
03-06-2008, 05:38 PM
Let's hope Jesus comes along real soon.

Korval
03-06-2008, 07:16 PM
Also, the IHVs sold us, the developers, on the advantages of the change from the ARB to Khronos: being informed, the OpenGL newsletter, more feedback, etc. And, after the initial 'explosion', there is nothing.

You say this as though the current issues are Khronos's fault. I mean, it may well be, but thus far, nobody has any evidence to suggest that it is. Let's not forget that before the initial "explosion" of information, there was nothing. They'd stopped releasing the ARB meeting minutes, so we had no information about anything with regard to OpenGL.

pudman
03-06-2008, 08:29 PM
They'd stopped releasing the ARB meeting minutes, so we had no information about anything with regard to OpenGL.

So we've returned to the Standard Operating Procedure then. Obviously it doesn't work quite as well in these modern times (see Zak's GDC comments).

Well, there will always be OpenGL ES! Start porting your apps to mobile devices. In a few years the hardware will catch up.

k_szczech
03-07-2008, 07:14 AM
True, but they might be under an NDA.
An NDA could be reason enough for not talking about OpenGL behind an employer's back, but the newsletters were official; no NDA should conflict with the very fact that a newsletter is released. It could at most limit the information provided within a newsletter to some ridiculous minimum, but the show must go on.

If some company starts working in secrecy, it is typical that they're trying to assure everyone that they mean business. No company wants to lose its credibility by disappearing without a word.

Korval
03-07-2008, 11:34 AM
An NDA could be reason enough for not talking about OpenGL behind an employer's back, but the newsletters were official; no NDA should conflict with the very fact that a newsletter is released.

He's talking about an NDA between the ARB/Khronos and someone external to it. The ARB would then be required not to talk publicly about that subject.

ruysch
03-08-2008, 07:48 AM
He's talking about an NDA between the ARB/Khronos and someone external to it. The ARB would then be required not to talk publicly about that subject.

Fair enough; however, that's speculation. In all fairness, Khronos/the ARB would still be able to give an information update such as: "We are currently working on resolving external issues which have slowed down the development of OpenGL 3.0; we hope to resolve this in the near future."

That would not violate any NDA. The lack of information is, in my eyes, arrogant.

Korval
03-08-2008, 09:45 AM
That would not violate any NDA.

Unless that too was covered by the NDA. It could also be a legal issue, which is also frequently not discussed.

In the latter case, what happens is that someone alleges patent infringement or something. They then say that they're willing to negotiate terms, but the ARB must not talk about it. And if the ARB does even so much as what you suggest, then the theoretical claimant will stop negotiating immediately and file suit.

Speculation? Certainly. But it makes more sense than "arrogance" (whatever that would mean in this context).

[ edit ]

Then again, there's this message (http://www.opengl.org/discussion_boards/ubbthreads.php?ubb=showflat&Number=235012&fpart=8) (scroll down about halfway) from Rob Barris, who is on the ARB. In it, he says:


It's not my place to provide any more details at this point other than to share my optimism that some deep seated conflicts are finally being resolved between the contributors on a number of levels, a trend I hope to see continue.

This suggests that the delay is purely due to internal issues within the ARB.

pudman
03-08-2008, 02:49 PM
It's official! OpenGL 3.0 is going to become a Ray Tracing API!

Quote: "Over time, I expect the graphics APIs will evolve to embrace ray tracing as part of the 3D graphics "bag of tricks" for making images." David Kirk, nVidia

Link here (http://www.pcper.com/article.php?aid=530). (I'm kidding, of course)

ruysch
03-08-2008, 04:31 PM
This suggests that the delay is purely due to internal issues within the ARB.

So let's pray that the IP rights are resolved in time. It's kind of funny that the benefactors of OpenGL seem to be doing all the right things to leave the industry with no choice but to skip OpenGL. They missed SIGGRAPH, they missed GDC; they really do not have that many chances left. It's hard to base products on a poorly defined standard.

Let's be optimistic, as Rob Barris apparently is; perhaps we will see an OpenGL 3.0 standard at SIGGRAPH 2008?

Zak McKrakem
03-09-2008, 04:09 PM
Come on! An NDA? OK, there can certainly be NDAs of some kind, but, continuing with the GDC topic: ATI, NVIDIA and MS talked about the future of D3D. They talked about the details of D3D 10.1, about their plans for D3D 11, about how they see the future of graphics hardware and what kind of features D3D 11 should have to support it, and about what developers want. They even asked for feedback on what people would like. Sure, they can't give away certain details of their hardware strategies and future products, but that is to be expected.
Are you telling me that no one can say anything about OpenGL? That's nonsense. Forgetting the game developer community is, in my opinion, not very clever.
Of course, this is just my opinion.

knackered
03-09-2008, 04:23 PM
game developer community??
do people still use GL for games?

k_szczech
03-09-2008, 04:39 PM
OK, I've found out that there actually is an NDA, but the ARB didn't read it carefully before signing. It begins with:
"Our goal is to ensure the expected future of the OpenGL standard."
And later there are two points written in very small font:
1. You are to stop working on OpenGL
2. You are to stop talking about OpenGL

I've also heard that the lawyer who prepared this NDA recently won a big prize in a small local contest organized by Microsoft.

Trust me at your own risk.

pudman
03-09-2008, 07:30 PM
First rule of Khronos: don't talk about OpenGL (unless it's ES).


do people still use GL for games?

I heard that the reason for the delay of Duke Nukem Forever is that they're waiting for OpenGL 3.0.

ruysch
03-10-2008, 01:18 AM
They even asked for feedback on what people would like.

I recall a few prominent members of these forums actually touring with Microsoft and presenting DirectX 10, requesting feedback and discussing back and forth what should come next. These discussions were not limited to game conferences; there was wide interest in the future of DirectX.

Guess we will never see this kind of openness from OpenGL?

Lindley
03-10-2008, 05:30 AM
game developer community??
do people still use GL for games?

Where do you think all the Mac ports come from? Particularly the ones that come out simultaneously with the Windows versions, such as anything from Blizzard.

ruysch
03-10-2008, 07:45 AM
How fat a contract do you think Blizzard has with Apple? ;-)

V-man
03-10-2008, 09:34 AM
They even asked for feedback on what people would like.

I recall a few prominent members of these forums actually touring with Microsoft and presenting DirectX 10, requesting feedback and discussing back and forth what should come next. These discussions were not limited to game conferences; there was wide interest in the future of DirectX.

Guess we will never see this kind of openness from OpenGL?

Sure there is. People from nVidia/ATI have come here asking for feedback. In one case they said they would get rid of NV_evaluators if nobody minded. They wanted feedback on EXT_framebuffer_object.
GLSL 1.20 is partly the result of requests for inverse and transpose modelview matrices.

knackered
03-10-2008, 11:24 AM
Where do you think all the Mac ports come from? Particularly the ones that come out simultaneously with the Windows versions, such as anything from Blizzard.
Somehow I knew someone would reference WoW.
Just browse this list and count the number of less-than-four-year-old titles...
http://store.apple.com/1-800-MY-APPLE/We...re/action_games (http://store.apple.com/1-800-MY-APPLE/WebObjects/AppleStore.woa/wa/RSLID?nnmm=browse&mco=B60586C4&node=home/shop_mac/software/action_games)

Seth Hoffert
03-10-2008, 11:31 AM
game developer community??
do people still use GL for games?

I use GL for both games and visualization purposes, and I enjoy using it.

knackered
03-10-2008, 12:37 PM
which games have you written with GL? Maybe I've heard of them?

Seth Hoffert
03-10-2008, 12:41 PM
You haven't, as I haven't released them. I wrote a solitaire game a while back when I first learned GL, and my latest game is Mahjong.

EDIT: Random screenshot from my game:
http://cse.unl.edu/~shoffert/cardsfixed.png

Korval
03-10-2008, 01:11 PM
How fat a contract do you think Blizzard has with Apple?

Right, because the only reason Blizzard would be willing to make a game for Macs would be a contract with Apple. It couldn't be because they simply want to make Mac games, or because it is in their financial interest to do so.

knackered
03-10-2008, 03:39 PM
WoW is a constant revenue stream; I dare say the development costs of the game itself aren't anywhere near as significant as the infrastructure required to maintain the persistent world database, accounts, and credit card numbers. It's worth hacking around with GL for a bit longer in order to get that stream of cash flowing.
Offline/non-subscription games, on the other hand, rely on unit sales alone, so it's more important to get the dev work done as cheaply as possible.

Overmind
03-11-2008, 03:48 AM
WoW is not the only game by Blizzard.

Warcraft III, Warcraft II, Starcraft, Diablo II and Diablo are all available on Mac (although only Warcraft III uses 3D graphics, that's not the point).

Starcraft II is going to be available on Mac, too.

None of these are generating constant revenue.

Seth Hoffert
03-11-2008, 06:24 AM
game developer community??
do people still use GL for games?

Perhaps you've forgotten which API many consoles use, such as the Wii, PS2, PS3, PSP and DS (and, oh, cell phones). ;)

bobvodka
03-11-2008, 06:28 AM
At best those use OpenGL|ES, and some of them don't even use that (I've not seen any OpenGL code in the PS2 code base I'm currently working with), though they might have an API like it.

stephanh
03-11-2008, 07:35 AM
Perhaps you've forgotten which API many consoles use such as the wii, PS2, PS3, PSP, DS, and oh, cell phones. ;)
PS3: Most devs use libgcm directly; OpenGL|ES is only there to get something working in a short amount of time.

Seth Hoffert
03-11-2008, 07:45 AM
OK, cool - but still, what's wrong with using GL for games? It's a 3D API.

bobvodka
03-11-2008, 08:34 AM
I think the point knackered was trying to make was that most games these days use D3D, go directly through a console's API, or use a 3rd-party renderer.

Blizzard's ports and iD are pretty much the last two commercial places I can think of that use OpenGL, and many, many amateurs are using D3D for various reasons (including the D3DX library, which makes life easier when learning); heck, right now I'd be happy to recommend D3D to a beginner because the API is that much simpler than OpenGL 2.1 is.

OpenGL needs GL3.0 if it is going to stand any chance of getting this sector back... unfortunately all we have is a nice wall of silence...

*shrugs and goes back to working*

Seth Hoffert
03-11-2008, 08:52 AM
Oh, okay. :(

Hampel
03-11-2008, 10:06 AM
heck, right now I'd be happy to recommend D3D to a beginner because the API is that much simpler than OpenGL 2.1 is.

Especially the D3D10 API!

CatDog
03-11-2008, 12:25 PM
So D3D10 fulfills what GL3 promises? (No rant! I really don't know, yet.)

CatDog

Seth Hoffert
03-11-2008, 12:31 PM
Meh, just get a real GL GPU (NVIDIA) and use the extensions. :)

speedy
03-11-2008, 02:19 PM
Count up +1 commercial place using OpenGL for games: we just recently had a German release, and the worldwide launch is upcoming:

Perry Rhodan - 3d-io.com (http://www.gamestar.de/aktuell/charts/verkaufscharts/ currently selling @ 6th place in Germany)

P.S. There is also one big UK-based studio I know of that uses OpenGL.

The current state of affairs is: if AMD/ATI exposes advanced HW features as extensions, we'll stay on OpenGL for the next production.

Fingers crossed. :)

PaladinOfKaos
03-11-2008, 02:36 PM
Introversion (http://introversion.co.uk) uses OpenGL for the Linux and MacOS X versions of Darwinia, at least. There's also an OpenGL renderer on Windows, although it defaults to D3D with the latest patch.

knackered
03-11-2008, 02:51 PM
And we all know why it defaults to D3D, at least if you've got an ATI card.
Darwinia! Really... a 2005 game... next you'll be holding Quake 2 up as an example.

Jan
03-11-2008, 04:48 PM
Ah, it's nice to know what people are actually doing. I had already heard about "Perry Rhodan" through GameStar magazine, although I'm not much of an adventure fan.

Jan.

RenderBuffer
03-11-2008, 08:07 PM
They've decided to scrap it, and go with OpenD3D10 as a result of Microsoft's new policy on open source implementations of their protocols. ;)

I've reconsidered my take on it being legal issues. Longs Peak is an API redesign over existing functionality, and I have trouble believing that that sort of reorganization could cause lengthy legal problems.

As suggested by earlier posts, I think the thing that "went wrong" happened some time ago between ARB members, and they've become less and less willing to work together.

With a little luck we'll be updated with a newsletter soon, but until then we just have to be patient.

Lindley
03-12-2008, 07:59 AM
WoW is not the only game by Blizzard.

Warcraft III, Warcraft II, Starcraft, Diablo II and Diablo are all available on Mac (although only Warcraft III uses 3D graphics, that's not the point).

Starcraft II is going to be available on Mac, too.

None of these are generating constant revenue.


If you're going to be exhaustive, you can't forget Warcraft 1.

And for my own part, I don't mind 4-year-old games. Gives me some breathing room before I have to upgrade my 6-year-old TiBook.

Ilian Dinev
03-12-2008, 09:03 AM
Count up +1 commercial place using OpenGL for games: we just recently had a German release, and the worldwide launch is upcoming:
Perry Rhodan - 3d-io.com
Looks awesome! We should put this up on the site, I think; it could really lift spirits around here. As a newbie, between the old tutorials/links in "Documentation" and "Coding Resources" and the links to silly small things like Tetris everywhere, I almost gave up on GL 2.1.
[Though I'm not developing PC software/games for a living; I just want to make some free middleware or have some current-gen fun, as Ogre/Torque have too much backwards compatibility in the way, so staying with nVidia is okay for me.]
I used to like ATi (for image quality) when I was a PC gamer in the SM2 era... the irony.

zed
03-12-2008, 01:25 PM
do people still use GL for games?
A more pertinent question is: do people still use the PC for games?

True, there's lots of casual game use (web-based Flash etc.), but I don't really see it generating much cash.

I often hear PC gaming is still big in Europe; Germany is often held up as an example of a country where the PC still thrives with regard to gaming.
In Germany in 2002, PC and console game sales were equal; five years later, PC sales are pretty much the same while console sales have doubled!

PC gaming, like Mac/Linux, is basically a niche product; true, there's value in being a big fish in a small pond, but still.

CatDog
03-12-2008, 03:39 PM
Germans really like niche products, I believe. For example, role-playing games with hundreds of hours of gameplay. Adventures, old-fashioned point-and-click, have their market here. And Germans like high-tech overkill: things like Crysis on quad-SLI...

And I don't think that this will change. Well, of course this is my personal observation only.

CatDog

Korval
03-12-2008, 05:06 PM
PC gaming, like Mac/Linux, is basically a niche product; true, there's value in being a big fish in a small pond, but still.

I'd love to be in a niche like Blizzard's with WoW. That's really "niche": 10+ million people paying an average of $10 a month (less than the full $15 because Chinese prices are per-hour, and much lower).

Just because one market is smaller than another doesn't make it a niche.

Jan
03-12-2008, 07:10 PM
Well, Crysis sells just as badly in Germany as in the US. And nobody has a quad-SLI system here; even dual-SLI is just a myth. Germans might invest more money in their PCs, but not THAT much more.

And about the role-playing games: Baldur's Gate, Neverwinter Nights and Mass Effect all sold just as well in the US, so I wouldn't call them a niche product. Sure, there are country-specific favorites, but that's normal. And since the German market is actually quite big, one can even sell those at decent numbers (like "Die Siedler" or all those strange/boring soccer managers).

Ever tried playing a first-person shooter with the Xbox 360 controller? "Disaster" is the first word that comes to mind. In my opinion, the mouse/keyboard combination is the best argument for playing games on the PC and not a console. Since we all have PCs anyway, why would I want to buy a console where I'm forced to use that abomination of an input device?

Jan.

Ilian Dinev
03-12-2008, 09:16 PM
Jan:
Crysis sold 1 million copies; UT3 barely 100k PC copies, IIRC. Meanwhile PS3/Xbox FPS games sell 2-14 million copies.

I'd say the keyboard/mouse combo is an abomination of an input device, but it simply depends on the type of game. Games that are horrible without a DualShock: JRPGs, mech/aircraft/racing games (you must have analogue input!), third-person shooters/action games, ...
About the only genres that need a keyboard/mouse are FPS and RTS, plus games with lots of UI to navigate, like Diablo 2's huge item stash.

Yet even though FPS games need a keyboard/mouse, they sell better on consoles. Maybe it has something to do with gamers' continued frustration that to play a new game on PC, you either need to buy new hardware to avoid constant frame drops, or you have to dumb the settings down a lot (and still get an inconsistent framerate). Many, many gamers have voiced this same reason for moving to consoles.
Btw, I don't really like FPS with a gamepad, so I had to solve that problem: http://youtube.com/watch?v=8MgROs03ykM ^^". There's also FragFX.

zed
03-12-2008, 11:28 PM
I'd love to be in a niche like Blizzard's with WoW. That's really "niche": 10+ million people paying an average of $10 a month (less than the full $15 because Chinese prices are per-hour, and much lower).
http://en.wikipedia.org/wiki/Niche_market
I'd define a market which comprises ~10% (and shrinking) as a niche market; it's certainly not the mainstream. Look at Epic/iD etc.: they're focusing more on consoles.
PCs are more strategy/RPG.
Anyway, WoW also uses OpenGL, which makes the share of D3D-only PC games even smaller.
You could also argue that the (*)iPhone (which I learnt today supports OpenGL ES) is a viable game platform, which I'm sure it is, but it's still a small pond.
http://www.joystiq.com/2008/03/06/spore-touch-fighter-shown-on-iphone/

(*) plus perhaps the new iPod

EvilOne
03-23-2008, 08:02 AM
Hmm, no Easter egg.

CrazyButcher
03-24-2008, 02:25 AM
It will probably be SIGGRAPH '08 before we hear anything.

Hampel
03-24-2008, 06:00 AM
And I would guess nothing before SIGGRAPH '10, and then they'll come up with Fahrenheit again...

EvilOne
03-24-2008, 06:33 AM
Hahaha. Damn, the only thing we need is GL ES 2.0 plus multiple render targets... Instead of a redesign, a cleanup would be okay.

@CrazyButcher Greetz from Erfurt.

k_szczech
03-26-2008, 04:36 AM
I wonder how many people will say "OpenGL 3.0 is coming out tomorrow!" on April 1st...

bobvodka
03-26-2008, 05:38 AM
They should release it on April 1st, just to throw everyone. :D

k_szczech
03-26-2008, 08:44 AM
An April 1st release?
Let me guess: a big file containing "Knackered was here! Knackered was here!..."? :D

knackered
03-26-2008, 09:33 AM
Nah, I'd lay a bet someone just renames the DX10 CHM to "GL3" and uploads it somewhere.
It is just so rude that nobody from the ARB/Khronos has made any kind of statement about what the hold-up is. This is not a professional approach to take.

k_szczech
03-26-2008, 09:54 AM
Especially since they stated there would be four newsletters every year, but we got only two in 2007.

Can't we really get at least a "We're not dead yet" kind of update?

bobvodka
03-26-2008, 12:36 PM
Maybe they are dead?
Nerve gas released into the face-to-face meeting by some unknown 3rd party....

elFarto
03-26-2008, 12:45 PM
...by some unknown 3rd party....
Microsoft...?

Humus
03-26-2008, 02:34 PM
It will probably be SIGGRAPH '08 before we hear anything.

I fear that you're right. I'm losing my faith in OpenGL, to be honest. For the last year or so I have barely done anything in OpenGL; it's been DX10 mostly. For someone who used to be pretty much an OpenGL fanatic, I find that I care less and less by the day. It doesn't matter how elegant the API gets in the end if it's delayed until it becomes irrelevant.

bobvodka
03-26-2008, 03:45 PM
*nods*
I'm with Humus on this one (although without the DX10-related stuff as yet; I'm directing my attention towards musical things instead). Heck, the delay would at least be tolerable if someone said something or gave us a vague clue as to when things might appear. The fact that at SIGGRAPH '07 they were "nearly ready" and now it is nearly six months on is just meh.

ebray99
03-26-2008, 04:09 PM
You know... I think anyone (and by extension any IHV) that works with Khronos *knows* that the current silence is seriously hurting OpenGL. That said, it makes me wonder if a company or two is trying to strong-arm Khronos into this situation with the intent of damaging OpenGL's reputation or killing it outright. I have a really bad feeling about this, since the clues all seem to point to a fundamental problem with the process (Khronos) and not with the new API itself.

Kevin B

knackered
03-26-2008, 04:27 PM
It might be more productive if we started petitioning Microsoft to add quad-buffered stereo support to Direct3D instead of worrying about the future of OpenGL. I'm stuck with GL for that one reason at the moment; cross-platform is not a consideration.
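(For context, the GL side of quad-buffered stereo is tiny, which is part of why it's hard to give up. A minimal, untested sketch, assuming a stereo-capable pixel format was requested at context creation; DrawSceneFromEye() is a hypothetical app-provided routine:)

// Minimal quad-buffered stereo sketch (assumes the context was created with
// a stereo pixel format, e.g. PFD_STEREO on Windows).
#include <GL/gl.h>

void DrawSceneFromEye(float eyeOffset);   // hypothetical app-provided renderer

void RenderStereoFrame()
{
    glDrawBuffer(GL_BACK_LEFT);           // select the left-eye back buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    DrawSceneFromEye(-0.03f);

    glDrawBuffer(GL_BACK_RIGHT);          // select the right-eye back buffer
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    DrawSceneFromEye(+0.03f);

    // SwapBuffers() then presents both back buffers at once.
}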

Lindley
03-27-2008, 05:46 AM
...by some unknown 3rd party....
Microsoft...?

Usually a good guess.

Jan
03-27-2008, 06:38 AM
You are going round in circles.

CrazyButcher
03-27-2008, 06:56 AM
Didn't you say that before, Jan? ;)

M/\dm/\n
04-15-2008, 02:07 PM
Is GL3 out yet?

Lord crc
04-15-2008, 03:15 PM
Nope, still not a word...

k_szczech
05-09-2008, 05:40 AM
People already talk about switching to DX10. In a couple of weeks we will have new threads in this forum with people sharing their experiences after switching to DX.
In a couple of months we will have a regular D3D discussion board here. It will effectively be a D3D forum for ex-OpenGL programmers.

Jan
05-09-2008, 10:06 AM
Yeah, but the discussion board would still have a point: it would be for people who grew up with OpenGL but were then forced to switch to D3D. So whenever someone has a problem with some feature because he is not used to D3D, this community can help him out better than regular D3D boards can, since we share the same background knowledge.

Of course, in about 10 years the board might be shut down, because the community will have vanished over time.

Jan.

Eddy Luten
05-09-2008, 10:15 AM
If this turns into a D3D board, it will be shut down as soon as possible -- that's what I would do, anyway. Regardless, it would be great to hear some opinions on D3D10/Vista, since I've heard too many bad things about Vista to jump the XP ship just yet.

bobvodka
05-10-2008, 02:44 PM
I've been using Vista since a couple of months after release (I accidentally forced myself onto it when I installed while very tired, didn't read the on-screen prompts correctly, and wiped out my XP install ;)), and aside from some iffy sound drivers from Creative during the first 7 months or so, I've never liked an OS more, and I've used DOS, 3.0, 3.1(1), Win95/98/ME/2K/XP and a few variations of Linux for short periods of time.

I wouldn't recommend it for a laptop yet, as laptop drivers tend to lag behind desktop ones, but on the desktop it is great.

As for D3D10: I've started reading about it and, tbh, based on what we originally knew about the GL3.0 specs, they look very similar API-wise.
- You create resources
- To use them you create 'views'
- You then bind these views into the pipeline to use

All validation is done 'up front' instead of at bind time, as per the GL3.0 'teaser' we had oh so long ago.
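
To make that concrete, here's a minimal sketch of the resource/view/bind pattern (untested, my own example; it assumes a valid ID3D10Device* named 'device' and omits all error handling):

// Minimal D3D10 resource/view/bind sketch (untested; assumes a valid
// ID3D10Device* named 'device', error handling omitted).
#include <d3d10.h>

void BindATexture(ID3D10Device* device)
{
    // 1) Create the resource itself; validation happens here, up front.
    D3D10_TEXTURE2D_DESC desc = {0};
    desc.Width            = 256;
    desc.Height           = 256;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* texture = NULL;
    device->CreateTexture2D(&desc, NULL, &texture);

    // 2) Create a view describing how the pipeline will see the resource.
    ID3D10ShaderResourceView* view = NULL;
    device->CreateShaderResourceView(texture, NULL, &view);

    // 3) Bind the view into the pipeline (pixel shader stage, slot 0).
    device->PSSetShaderResources(0, 1, &view);
}

The expensive validation happens in the two Create* calls; the bind itself is then cheap, which is exactly the split the GL3.0 teaser described.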

LogicalError
05-12-2008, 02:13 AM
http://sciencenews.org/view/generic/id/31927/title/The_undeciders_

"English historian Cyril Northcote Parkinson observed in the 1950s that decision making is severely impaired in committees of more than 20 people."

...maybe that's why GL3 is taking so long: too many committee members?

EvilOne
05-12-2008, 05:29 AM
I'm one of those people switching to D3D. Not to D3D10, but to D3D9 (which has all the stuff I need). I have abstracted my renderer and keep the GL path only for "nostalgic" reasons (not really stable, just for fun)... There are some nice points:

1.) No more trial and error, hitting the fast path is so damn easy... Just query some caps, create render targets, load shaders, have fun.

2.) It works on ATi and Intel out of the box, no more guesswork.

3.) See point 2.

4.) HLSL just rocks...

I'm quite happy with my backend now. Fast and stable.
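
(For the curious, point 1 really is just a handful of calls. A rough, untested sketch of the caps-query step, with my own hypothetical names; it assumes an IDirect3D9* obtained from Direct3DCreate9(D3D_SDK_VERSION) and omits error handling:)

// Rough D3D9 caps-query sketch (untested; 'd3d' is assumed to be a valid
// IDirect3D9* from Direct3DCreate9(D3D_SDK_VERSION)).
#include <d3d9.h>

bool DeviceLooksGoodEnough(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Example policy: require SM2.0 shaders and two simultaneous render targets.
    bool shadersOk = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0)
                  && caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
    bool mrtOk     = caps.NumSimultaneousRTs  >= 2;

    return shadersOk && mrtOk;
}

Compare that to probing GL extension strings and guessing at the fast path.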

V-man
05-12-2008, 06:06 AM
It works on ATi and Intel out of the box, no more guesswork.

I have considered adding a D3D9 path and it's a strong possibility that I will.
About Intel and D3D, it is still guesswork. Someone who does tech support for Half-Life 2 and the other products says that whenever someone has a bug and an Intel chip, the answer is always the same: get an ATI or nVidia card.
ATI + D3D, on the other hand, is a perfect combination.

To your list I would add:
5) a software rasterizer;
6) PIX and other tools for profiling;
7) 90-something percent of games are on D3D, which makes HLSL more utilized and less buggy;
8) I have done what I can to get good performance out of GL. I'm wondering if D3D can do better.

Ashkan
05-12-2008, 06:30 AM
People already talk about switching to DX10. In a couple of weeks we will have new threads in this forum with people sharing their experiences after switching to DX.
In a couple of months we will have a regular D3D discussion board here. It will effectively be a D3D forum for ex-OpenGL programmers.

Statement:
Delay the product => lose the customers.

Corollary:
If you don't want to lose the customers, don't delay the product.

Maybe someone should make that a sticky or something, to remind the committee what the outcome will be (in case they don't already know).

OpenGL 3.0 is a big joke. We've been waiting for it forever... and in case you're wondering, I'm so sick of "they'll make it right instead" statements.