
View Full Version : OpenGL 3 Updates




Mark Shaxted
05-02-2008, 10:02 AM
Is there some sort of 'beta' testing going on with NDAs in place? Are we going to get a surprise release imminently?

BTW (this being as good a place as any), what will the minimum hardware requirements be for GL3? GeForce FX/6000/7000/8000?

bobvodka
05-02-2008, 10:47 AM
DX9-class hardware is the lower limit; I think the GeForce FX just about makes it in, and the ATI Radeon 9500 should be good as well.

And if they are NDA beta testing, then this is the worst public-relations stunt ever pulled; fine, don't give out details, but a wall of silence tends to make people jump ship.

Brolingstanz
05-02-2008, 12:17 PM
If the silence is due to a non disclosure agreement, I'd have to say I don't find non disclosure very agreeable at all.

CatDog
05-02-2008, 01:38 PM
But it doesn't mean you have to vanish into thin air, hm?

CatDog

pudman
05-03-2008, 09:37 AM
But would an NDA prevent one from saying "Yes, we are still actively working on {stuff} and yes there is an NDA." I put {stuff} in there in case it may no longer be called "OpenGL3.0", for example.

Even a simple statement like that would alleviate some of the tension on this forum.

Demirug
05-03-2008, 11:11 AM
Depends on the NDA. But it is not uncommon for the existence of an NDA to be covered by the NDA itself. In such cases you are not allowed to tell anyone that there is an NDA.

Roderic (Ingenu)
05-05-2008, 02:17 AM
They are still actively working on the next OpenGL version.

Hampel
05-05-2008, 03:29 AM
If they had been told anything, the info would already have leaked into this thread, I think (or hope)...

Hampel
05-05-2008, 03:35 AM
They are still actively working on the next OpenGL version.

Any further info, Roderic? :confused:

Jan
05-05-2008, 03:57 AM
Thank you, Roderic. It is not much information, but as stated a hundred times before, every piece of info is better than none at all. Maybe you could get a job with the ARB as PR manager... ;-)

My problem is that switching to DX9 is not a step forward, and DX10 isn't possible since I don't even use Vista myself. That really leaves only OpenGL 3 for me. All my renderer development has been stalled for nearly a year now; I'm pretty pissed.

Jan.

V-man
05-05-2008, 08:17 AM
Depends on the NDA. But it is not uncommon for the existence of an NDA to be covered by the NDA itself. In such cases you are not allowed to tell anyone that there is an NDA.

It doesn't make sense. They announced it on many websites. Large companies became aware that it was being developed.
Articles appeared all over the place.
It is more likely they are trying to get companies on board but some are against it. For one thing, how will it be exposed under Windows? That requires MS to agree.

Leadwerks
05-05-2008, 09:54 AM
This has all the markings of a cancelled project to me. I hope I am wrong, but hoping for OpenGL 3 feels delusional under these circumstances.

pudman
05-05-2008, 02:21 PM
They are still actively working on the next OpenGL version.

Even from this tidbit we can speculate. Between "cancelled project" and my "{stuff}", I think we may be on the right track. If there is an OpenGL "3.0" it may be very different from what was described to us earlier. They may have decided to skip 3.0 once they considered the implications a Mount Evans refactor would entail, given insider knowledge of the advancement of hardware.

Its NDA status would IMO imply that the next OpenGL version has revealing characteristics that hardware vendors are not quite ready to announce. Given that NVIDIA will likely reveal their summer refresh soon (NVIDIA media day (http://www.theinquirer.net/gb/inquirer/news/2008/05/03/nvidia-reschedules-editors-day)), we may see the NDA lifted soon as well.

If there was conflict behind the decision to refactor 3.0, they might have felt the NDA was necessary to keep that conflict, and the negative public effects it would create, from more seriously affecting the process.

Lord crc
05-06-2008, 08:39 AM
This situation reminds me of some friends I knew back in the day, who always had THE most kick-ass 3D engine just around the corner, for five years or so. Of course they never got around to releasing more than simple tech demos...

Brolingstanz
05-06-2008, 11:18 AM
Siggraph '08 is but a few short months away now. We're bound to have heard something by then. If not, then we can send in Franky Knuckles and Elbows Malone to negotiate.

RenderBuffer
05-06-2008, 12:50 PM
Info for the Siggraph 2008 OpenGL class has been up for a long time, and I don't think OpenGL 3.0 will be ready for it.

From http://www.siggraph.org/s2008/attendees/classes/11.php

"This class presents an introduction to OpenGL, espouses best practices for performance and compatibility with future versions of the API, and provides a glimpse of OpenGL's future directions."

bobvodka
05-06-2008, 02:29 PM
Given the involvement of an ARM guy on that page, it is logical to assume that the mobile domain is going to feature. Now, given that Bill Licea-Kane and Evan Hart are also 'on board' with this, it would also imply desktop stuff.

Reading between the lines a little (and generally making stuff up), I can't help wondering if there has been some kind of speed-up in the merging of OpenGL and OpenGL|ES, which has been put forward before as something that would happen in the future. Maybe they have made changes to make that merge easier? Maybe there is a 'unified' API in the works? Maybe I'm talking out of my arse and I should just get on and plan to use D3D10 for my next project...
(hey, for all its "Vista only"-ness it is at least here, and it's not like you can use DX10 features on AMD hardware without it anyway)

mikethebike
05-08-2008, 04:58 AM
When will OpenGL 3 be released? Around what date?
Thanks

Hampel
05-08-2008, 05:19 AM
When will OpenGL 3 be released? Around what date?

The date is hidden in one of the 21 pages of this thread. Happy searching... ;)

Jan
05-08-2008, 06:29 AM
Lol... yeah, good luck. If you find the date, feel free to post it "again" :D

PkK
05-08-2008, 07:03 AM
We went from speculating about "when" to "if" some time ago.

Philipp

Eosie
05-08-2008, 11:28 AM
This section should be renamed Items of Ignorance.

bobvodka
05-08-2008, 11:51 AM
or the OpenGL.org forum members' rambling zone.

Brolingstanz
05-08-2008, 01:24 PM
Coincidentally, ignorance is sometimes fertile ground for some mighty fine rambling.

Eddy Luten
05-08-2008, 01:53 PM
Though the ignorance would be eliminated by an update... or a Pipeline newsletter issue, or anything, really.
At the moment I'd be happy with "OK, we've cancelled OpenGL 3.0 in favor of {X}", which, although grim, would put aside the wild speculation. Seven months of nothing begs for some news, any news.

bobvodka
05-08-2008, 04:54 PM
Personally, I pretty much decided this weekend to take a serious look at D3D10. When I first learnt OpenGL, D3D8 was current and its interface was horrible, so OpenGL was the better option; D3D9 was an improvement, and if D3D10 is better still then I'll be using that from now on.

In short: I've had enough of waiting, and I feel sorry for those who are stuck with OpenGL for whatever technical reasons they have :(

CatDog
05-08-2008, 05:10 PM
If you do this, please make a new thread and share your experiences!

Actually, that's what I wanted to do, but I'm far FAR from being ready to switch over...

CatDog

Brolingstanz
05-08-2008, 10:59 PM
Now, given that Bill Licea-Kane and Evan Hart are also 'on board' with this, it would also imply desktop stuff.


Interestingly, their sections are listed as "Forthcoming". Also I don't see a listing for OpenGL yet in the BOF list. Must be something warming up in the wings...

k_szczech
05-09-2008, 05:44 AM
I have an idea - let's all switch to D3D!

Then, when OpenGL 3.0 is finally released, it will have to be extremely good!
Otherwise, regular D3D programmers like us will not even consider switching from D3D to OpenGL.

So, for the sake of OpenGL, we all have to switch to D3D now ;)

-NiCo-
05-09-2008, 05:55 AM
Sure, can you point me to the D3D SDK for Linux? ;)

ZbuffeR
05-09-2008, 07:34 AM
And for Mac OS X ?

NeARAZ
05-09-2008, 08:58 AM
What about Amiiigaaaa?! :)

Eddy Luten
05-09-2008, 09:10 AM
Considering the shrinking market for PC/desktop games, I doubt many ISVs will put much time into Linux and Mac. Which is quite sad, actually.

If you want to make a profit you have to go with a popular platform (Xbox 360, PS3, Wii, PC). But Amiga? C'mon... that's a joke, right? How many people still use that platform?

Demirug
05-09-2008, 11:24 AM
Personally, I pretty much decided this weekend to take a serious look at D3D10. When I first learnt OpenGL, D3D8 was current and its interface was horrible, so OpenGL was the better option; D3D9 was an improvement, and if D3D10 is better still then I'll be using that from now on.

The overall structure of Direct3D 10 is very similar to what we have seen from OpenGL 3.0 so far. That's the reason I had postponed any work on my Direct3D 10-to-OpenGL wrapper.

The main difference, from a high-level look, is that OpenGL 3.0 uses C bindings while Direct3D 10 is based on C++ interfaces (with a little bit of COM).

There are some pitfalls when porting Direct3D 9 or OpenGL 2.x applications that can totally ruin your performance. I learned some of them the hard way while porting our engine from 9 to 10.

If you have any questions, feel free to ask. But I am not sure if this is the right place to discuss Direct3D details.

Eddy Luten
05-09-2008, 11:45 AM
I don't know if this is reliable at all but taken from one of the comments on my blog (http://scriptionary.com/blog/2008/05/08/the-ghost-of-opengl-30/#comment-43):

I'm on the ARB mailing list and I can assure you they are working hard for a release (I'm not a contributor to the spec, I just follow it). While I also feel they should release more news, things like patents hindering it or someone wanting to kill OpenGL are totally false. Stop being paranoid :D. Some game developers are also actively contributing to the spec. There are a lot of difficulties in coming up with a new OpenGL spec: they would like it to have a long life, so they try hard to revise the proposed object mechanisms so that they interact well with other parts of the API. They have been tirelessly revising the specs to iron out the issues (no really, coming up with a proper design really *is* hard; don't jump to the conclusion that the new OpenGL is also ill-designed). They are doing a pretty good job with the current kind of organization, with members from different companies contributing. Probably the best way would be to have some people working together in the same physical location with all their time dedicated to it, but such a thing would be hard to realize.
Can anyone verify?

Brolingstanz
05-09-2008, 01:05 PM
Why don't we just ask Old Mother Hubbard what she thinks ;-)

Brolingstanz
05-09-2008, 01:10 PM
What about Amiiigaaaa?! :)

Amiga and Atari. Now there are 2 platforms steeped in OpenGL lore ;-)

Zengar
05-09-2008, 01:54 PM
Anyhow, if anyone from the GL3 team reads this, my advice is: change the branding. Don't call it OpenGL 3.0; invent a new name instead. GL3 would be parting from the ideas and the design of the current OpenGL; also, "OpenGL" is often associated with buggy and slow drivers in the minds of customers. Let's be honest, the major driving force behind 3D technology is gaming, and GL3 has to be (and hopefully is being) designed with this in mind; so if a new API is released, it needs a new name to indicate a new direction. Call it Open3D or NextGL or whatever.

tanzanite
05-09-2008, 02:01 PM
/.../ or NextGL /.../ Naah ... there is only one x and even that is a small one. Any other suggestions?

Zengar
05-09-2008, 02:12 PM
We should start a thread with best suggestions :p

PkK
05-09-2008, 02:16 PM
Use winelib if you want to make a native Linux App that uses the D3D API.
According to http://winehq.org/site/status_directx D3D9 is there, but no D3D10 yet.

Philipp

ZbuffeR
05-09-2008, 03:05 PM
Use winelib if you want to make a native Linux App that uses the D3D API.
According to http://winehq.org/site/status_directx D3D9 is there, but no D3D10 yet.

Philipp

What a stupid idea. Wine will talk GL to the video card; why would you want to convert your app from GL to D3D just so that it gets dynamically converted back to GL?

To me, OpenGL 3.0 is a fine name.
GL 2.0 should already have brought more changes, but it actually ended up as some sort of GL 1.6.

Brolingstanz
05-09-2008, 03:28 PM
yeah I don't see a whole lot of "Direct" in that arrangement. Anyone can layer a thin abstraction over an existing API, but it is cool that they've made an attempt to do it for us. Heart seems like it's in the right place.

I like the numbered versions too. Be fine with me if they just followed the shader model numbers from here on in (GL3, GL4 (Mt. Evans), ...), but that's probably too frenetic a pace and could be somewhat misleading unless something fundamentally different is en route.

elFarto
05-10-2008, 02:06 AM
I sent an email to Michael Gold on an unrelated matter and asked if he could give an update on OpenGL 3, unsurprisingly he said:


I truly wish I could say something about OpenGL 3.0. I am well aware of the community's frustration. Unfortunately I am not authorized to make any statement at this time.
Well, at least they know we're frustrated...

Regards
elFarto

PkK
05-10-2008, 02:06 AM
What a stupid idea. Wine will talk GL to the video card; why would you want to convert your app from GL to D3D just so that it gets dynamically converted back to GL?

-NiCo- asked for a D3D SDK for Linux. While OpenGL will be faster on Linux most of the time, winelib can be useful for programmers who don't want to use OpenGL (maybe they want D3D for Windows, since D3D drivers tend to be better than GL drivers there, and they don't want to implement two rendering backends, whatever).

However, I don't know how good winelib's D3D really is (I don't even know the D3D API on Windows; so far GL 1.5 has been good enough for all the graphics programming I've done).

Philipp

ZbuffeR
05-10-2008, 02:58 AM
What a stupid idea. Wine will talk GL to the video card; why would you want to convert your app from GL to D3D just so that it gets dynamically converted back to GL?


-NiCo- asked for a D3D SDK for Linux. While OpenGL will be faster on Linux most of the time, winelib can be useful for programmers who don't want to use OpenGL
It seemed to me that the point of the discussion was "switching to D3D because of no news on GL 3". Using Wine limits your possibilities to the intersection of the features of GL and D3D, so it is useless in this case.
You are right that it may be fine for learning D3D, but I would not advise developing against a different implementation of the Windows calls than the one 90% of your target audience uses...

As for the completeness of Wine, it is pretty good on the DX9 front, as shown by recent games running on it:
http://www.youtube.com/watch?v=zwdtB8k9VuI
They are nearing the 1.0 version milestone.

Demirug
05-10-2008, 03:31 AM
However, I don't know how good winelib's D3D really is (I don't even know the D3D API on Windows; so far GL 1.5 has been good enough for all the graphics programming I've done).

Philipp


While the Direct3D part of winelib is not perfect, it already provides a high level of compatibility (hard and soft). Unfortunately there are some Direct3D constructs that are hard to map to OpenGL or have no driver support at all. As long as you don't hit such corner cases, porting is quite painless. Wine and CrossOver Games, which are both based on the same code base as winelib, can run many Windows games without any changes at all. Anything else is work in progress, and they are already planning for Direct3D 10 support. But from what I understand, they were waiting for OpenGL 3.0 to decide whether to translate Direct3D 10 calls to 2.x or 3.0.

k_szczech
05-10-2008, 03:35 AM
Hey, guys - I was being sarcastic when I said we should switch to DX. Really :D

MZ
05-11-2008, 10:02 AM
I sent an email to Michael Gold on an unrelated matter and asked if he could give an update on OpenGL 3, unsurprisingly he said:


I truly wish I could say something about OpenGL 3.0. I am well aware of the community's frustration. Unfortunately I am not authorized to make any statement at this time.
Well, at least they know we're frustrated...

Well, at least we know they are frustrated about our frustration too... :)

That's something, in these hard times... For god's sake, we haven't had a post from Korval on opengl.org for 2 months. If this isn't a sign of the end times, I don't know what is.

Groovounet
05-11-2008, 01:04 PM
I'm not being sarcastic if I tell you I'm seriously considering switching to DirectX. Tools, reliability, features, documentation? Could someone point out something where Direct3D 10 isn't better? Right, Vista sucks, but if we start a new project today... maybe D3D10 is where to go! The Khronos Group is so fortunate that Vista sucks and that the Xbox 360 is based on Direct3D 9, really!

Groovounet
05-11-2008, 01:13 PM
Don't we deserve some news, to know where to go, to stop wasting our time? Do we need to start a petition? Should we complain enough to make a buzz? The Khronos Group taking control of OpenGL was supposed to be a good thing; now we have an answer: it sucks!!!

bobvodka
05-11-2008, 04:06 PM
Right, Vista sucks, but if we start a new project today... maybe D3D10 is where to go! The Khronos Group is so fortunate that Vista sucks

Except: Vista DOESN'T suck.
I've been using it as my primary OS since around 2 or 3 months after it was released, and aside from Creative's usual driver suckiness I've found it fast, stable and above all better than XP. This is on a 2.2GHz AMD X2 with 4GB of RAM, the same system which ran WinXP x64 and WinXP before it.

Zengar
05-11-2008, 04:22 PM
Second that...

Still, I am switching to a Mac anyway :)

Jan
05-12-2008, 06:58 AM
"The Khronos Group is so fortunate that Vista sucks, that the Xbox 360 is based on Direct3D 9"

Just what I have been saying for months, but time is running out.

And even if OpenGL 3 turns out to be the most kick-ass spec one could imagine, it has another problem: people KNOW its story. People will avoid it simply because they know that the ARB is slow and unreliable. There is a future for OpenGL, simply because of its platform independence, but in the gaming market I fear it will not become popular, since there is no driving force behind it that companies can rely on.

And then there is also the driver question, of course.

Jan.

elFarto
05-12-2008, 11:45 AM
Just hang on there a few more months guys.

Regards
elFarto

Soth
05-13-2008, 11:50 AM
http://www.opengl.org/events/details/siggraph_2008_los_angeles_california/

Siggraph 2008 - Los Angeles California

OpenGL 3 Updates
Don’t miss the great updates on OpenGL3 at the SIGGRAPH BOF!

speedy
05-13-2008, 12:35 PM
Fingers crossed. :)

zed
05-13-2008, 03:03 PM
http://www.opengl.org/events/details/siggraph_2008_los_angeles_california/

and they say graphics is a male-dominated field.
Three months to the unveiling.
Can't say I care too much WRT games; the PS3/XB360 are only at about the OGL 2.0/D3D9 level, so in reality, unless you foolishly limit yourself to PC only, OGL 3.0 is unneeded for a few years.

BTW, don't discount the Mac. True, for years it and Linux have been niche markets with 2-4% share each, but it has really taken off in the last couple of years.
I predict that within 5 years, people accessing the internet through Windows will be in the minority, i.e. <50%.

knackered
05-15-2008, 11:30 AM
and they say graphics is a male dominated field.

I can certainly see some tits in there.

Soth
05-15-2008, 11:39 AM
and they say graphics is a male dominated field.

I can certainly see some tits in there.
Still, it's more cocks than pussies.

knackered
05-15-2008, 12:18 PM
have we been reduced to this? arguing and bickering amongst ourselves?
THIS IS WHAT THE KHRONOS GROUP HAS DONE TO US!
Damn their eyes.

Jan
05-16-2008, 01:41 AM
Soth: Get off the board; people who post such things don't belong here. When knackered posts such a comment it isn't OK either, but after over 2700 posts and many years in this community, it is a different thing than for a newbie like you. If you only have offensive things to say, just don't post anything at all.

Jan.

Zengar
05-16-2008, 02:03 AM
Where are the mods anyway?

Hampel
05-16-2008, 02:29 AM
I think the mods are under NDA too, and cannot react to statements made in the forums... :p

V-man
05-16-2008, 08:00 AM
http://www.opengl.org/events/details/siggraph_2008_los_angeles_california/

and they say graphics is a male-dominated field.
Three months to the unveiling.
Can't say I care too much WRT games; the PS3/XB360 are only at about the OGL 2.0/D3D9 level, so in reality, unless you foolishly limit yourself to PC only, OGL 3.0 is unneeded for a few years.

BTW, don't discount the Mac. True, for years it and Linux have been niche markets with 2-4% share each, but it has really taken off in the last couple of years.
I predict that within 5 years, people accessing the internet through Windows will be in the minority, i.e. <50%.

IMO, GL 3.0 has been needed for a long time. I don't know what the final GL 3.0 will be, but what is needed is a cleaned-up API that is up to date and VERY simple to write drivers for. Let me explain:

If you download some of the early driver versions for your ATI or NVIDIA card, you'll see that there is no GL_ARB_texture_float, no GL_EXT_framebuffer_object, and some other extensions are missing.

Compare GL 2.0 with D3D9c: those same features are core in D3D and work very well.
I believe that even now, on ATI you can't make an FBO that has a stencil buffer. Correct me if I'm wrong.
D3D10 is a cleaned-up API with all its new features core. Those same features are not core in GL 2.1, and they are not offered on ATI at all.

It may well be that some ARB players have not been cooperative, but the bottom line is GL has been behind D3D for a few years now.
I don't think the gaming industry will switch APIs. The next-gen games for Christmas will likely have DX9 and DX10 backends.

Soth
05-16-2008, 11:42 AM
Soth: Get off the board; people who post such things don't belong here. When knackered posts such a comment it isn't OK either, but after over 2700 posts and many years in this community, it is a different thing than for a newbie like you. If you only have offensive things to say, just don't post anything at all.

Jan.



Since when did cocks (http://ru.youtube.com/watch?v=9Mf65T-gz5o) and pussies (http://ru.youtube.com/watch?v=-yYNgRHs8EU) become offensive things? BTW, I think that you are more offensive than me.

Brolingstanz
05-16-2008, 12:30 PM
I don't know what the final GL 3.0 will be, but what is needed is a cleaned-up API that is up to date and VERY simple to write drivers for.

Hear hear.

bobvodka
05-16-2008, 01:05 PM
Just hang on there a few more months guys.

That's been the story of OpenGL: "wait a little longer! Good things are coming, we promise!"

Last time they said something like that we got GL 2.0: an API stripped of all the good ideas, nothing more than a GL 1.6.

And the problem is, after a while people get sick of waiting and wander off elsewhere to see what else they can do...

zed
05-16-2008, 01:37 PM
It's official: PC gaming is dead.
Alex St. John declares consoles are dead:
http://www.gamasutra.com/php-bin/news_index.php?story=18685

Like I say, unless your focus is on a niche, OGL 3.0/D3D10 are irrelevant (at least until 2010+).

WRT pussies:
http://www.youtube.com/watch?v=SKehkKqRlpc
this show prophesied the advent of the Paris Hiltons etc. of the world

samantha
05-16-2008, 01:51 PM
Behave, boys.

Brolingstanz
05-16-2008, 06:24 PM
Just hang on there a few more months guys.

That's been the story of OpenGL: "wait a little longer! Good things are coming, we promise!"


The good news is that at least we now have a definite date for an update, which is really what most folks are clamoring for.

Now if only everyone can make it through the summer without something to complain about. My right eye is already beginning to twitch a little...

<relevant joke>
This guy says to his shrink, "Doc, it's my brother... he thinks he's a chicken."
Doc says, "So, why don't you turn him in?"
Guy says, "I would, but I need the eggs."
</relevant joke>

Eosie
05-16-2008, 10:02 PM
I believe that even now, on ATI you can't make an FBO that has a stencil buffer.
You can. There is EXT_packed_depth_stencil, really. ;)
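For anyone hitting this on 2008-era drivers, the trick the extension enables is attaching one combined depth+stencil renderbuffer to both attachment points of an EXT_framebuffer_object FBO. A minimal sketch of the pattern, assuming a live GL context and that the EXT entry points have already been loaded (e.g. via GLEW); the 512x512 size and RGBA8 color format are just illustrative:

```c
/* Sketch: FBO with a packed depth+stencil renderbuffer
 * (EXT_framebuffer_object + EXT_packed_depth_stencil). */
GLuint fbo, color_rb, ds_rb;

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);

/* Ordinary color renderbuffer. */
glGenRenderbuffersEXT(1, &color_rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, color_rb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_RGBA8, 512, 512);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                             GL_RENDERBUFFER_EXT, color_rb);

/* One packed depth+stencil renderbuffer, attached to BOTH points. */
glGenRenderbuffersEXT(1, &ds_rb);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, ds_rb);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH24_STENCIL8_EXT, 512, 512);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, ds_rb);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_STENCIL_ATTACHMENT_EXT,
                             GL_RENDERBUFFER_EXT, ds_rb);

if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT) {
    /* Driver rejects the packed format: fall back to a depth-only FBO. */
}
```

Check the extension string for both GL_EXT_framebuffer_object and GL_EXT_packed_depth_stencil before relying on this; separate depth and stencil renderbuffers are exactly the combination the FBO spec lets implementations report as unsupported, which is why the packed format is the portable route.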

dor00
05-19-2008, 03:32 AM
Any chance to get a beta or alpha to play with in the meantime?

bobvodka
05-19-2008, 04:08 AM
There isn't even a spec yet....

LogicalError
05-19-2008, 04:36 AM
There isn't even a spec yet....

excuses...

bobvodka
05-19-2008, 05:41 AM
Well, no.
Without a spec it's pretty hard to have an alpha version :P
Even if you installed a driver tomorrow with GL3.0 support, it would be no good, as you wouldn't know how to use it ;)

Now, if he meant "alpha spec", well, that just ain't gonna happen, as Khronos doesn't roll that way. And even with an "alpha" spec you'd still have no implementation to play with.

k_szczech
05-20-2008, 05:10 AM
@bobvodka - I believe LogicalError was simply sarcastic :)

LogicalError
05-21-2008, 04:35 AM
@bobvodka - I believe LogicalError was simply sarcastic :)

and k_szczech gets a cookie (no cookie for you bobvodka, sorry!)

bobvodka
05-21-2008, 05:23 AM
:(

Mars_999
05-22-2008, 08:24 PM
Where is GL 3.0! :) LOL. OK, in all seriousness, how many of you are thinking of moving to DX10? Curious to know how badly Khronos is killing off GL developers. My problem is that OSX/Linux will never have DX, and I can't see Apple, Sony (does Nintendo use GL?) allowing DX to get ahead of GL; then there would be a graphics difference. As of now ATI sucks for GL, whereas NVIDIA has opened up much of the DX10 feature set through extensions. Anyway, everyone just hold out till SIGGRAPH '08 and see what happens...

dor00
05-23-2008, 12:04 AM
Where is GL 3.0! :) LOL. OK, in all seriousness, how many of you are thinking of moving to DX10? Curious to know how badly Khronos is killing off GL developers. My problem is that OSX/Linux will never have DX, and I can't see Apple, Sony (does Nintendo use GL?) allowing DX to get ahead of GL; then there would be a graphics difference. As of now ATI sucks for GL, whereas NVIDIA has opened up much of the DX10 feature set through extensions. Anyway, everyone just hold out till SIGGRAPH '08 and see what happens...

All well and good, but the problem is I DON'T CARE about DirectX, DX10, DX11 or whatever else you want from Microsoft.

bobvodka
05-23-2008, 03:08 AM
You probably should care, as that's a major driving force behind what we get graphics-wise.

At the end of the day, which graphics API you use shouldn't be the almost religious issue it seems to be; use what you need to get the work done. I want to work at a D3D10 level of hardware right now, and for that the D3D10 API is cleaner and more widely supported (AFAIK AMD still don't have the same D3D10 features exposed as NV do).

Once GL3 makes its appearance, based on the previews it looks to be broadly the same as D3D10 in entry points and general use, so writing a backend which targets it should be reasonably trivial.

End of the day, it's only an API, and I just wish people would get over it.

Jan
05-23-2008, 03:23 AM
You're right, it's only an API. And as several people have said many times before, switching to another API is not the problem. People don't care THAT much about which API they use. The only problem is that D3D10 is Vista-only, and many people (including those on this board) don't use Vista; some don't even use Windows at all.

So in the end, for many people (including me) it is not a decision based on which API we prefer, but on which platform we want to develop for.

To reformulate your last statement:
It's only an API, I just wish the ARB would get it done!

Jan.

Brolingstanz
05-23-2008, 03:38 AM
Zip-a-dee-doo-DAH, zip-a-dee-ay
My, oh my what a wonderful day!
Plenty of a-PI heading my way
Zip-a-dee-doo-DAH, zip-a-dee-AY!

dor00
05-23-2008, 04:18 AM
You probably should care, as that's a major driving force behind what we get graphics-wise.

At the end of the day, which graphics API you use shouldn't be the almost religious issue it seems to be; use what you need to get the work done. I want to work at a D3D10 level of hardware right now, and for that the D3D10 API is cleaner and more widely supported (AFAIK AMD still don't have the same D3D10 features exposed as NV do).

Once GL3 makes its appearance, based on the previews it looks to be broadly the same as D3D10 in entry points and general use, so writing a backend which targets it should be reasonably trivial.

End of the day, it's only an API, and I just wish people would get over it.

Mate, please understand: beginning of the day, end of the day, whatever period of the "DAY" you want, I DON'T CARE about DirectX at ALL. Linux doesn't have DirectX. Do you understand now? Vista, DX10, Microsoft can give me even flying cows; if they don't do it natively on Linux, I DON'T CARE. PERIOD.

bobvodka
05-23-2008, 04:53 AM
You should still care, if only from a features point of view.
Sure, you aren't using Windows and you'll never use DX10; however, when GL3 gets here you'll still be using DX10-level features, so unless you want to live in a pit of ignorance with regard to features, at least knowing what D3D10 supports is worthwhile.

Jan
05-23-2008, 01:11 PM
"at least knowing what D3D10 supports is worth while"

Reminds me of a comment from some ARB guy when they were talking about how they were designing GL 3.0. He said he didn't want to make it "like D3D", and as proof of how serious he was, he stated that he had never even looked at D3D10.

He meant that in a positive way (not being tied to MS or whatever), but I was surprised that he could claim IGNORANCE could ever be positive. I mean, how can I develop a competitive API (in time, haha) if I have never even looked at what others have done? It's not like you get spoiled or brainwashed by it.

Anyway, I assume most ARB guys DID take a look at D3D10, at least by now. Maybe they did 6 months ago and decided that their current design for OpenGL 3.0 wasn't good enough ;-)

Jan.

CatDog
05-23-2008, 01:52 PM
He meant that in a positive way (not being tied to MS or whatever), but I was surprised that he could claim IGNORANCE could ever be positive.
Oh, new innovations need this kind of ignorance. To find new ways, it's often best not to know about the old ones. The danger is that you probably accidentally reinvent the wheel. The solution is to have somebody by your side who knows the old ways very well, but tells you about them only when you start to stumble over them.

Anyway, I agree that this shouldn't apply to OpenGL 3!

Because another danger when searching for new ways is getting lost without a trace. ;)

CatDog

bobvodka
05-23-2008, 01:53 PM
Well, from my usage thus far, and based on the previews, GL 3.0 is VERY close to D3D10.

Not in function names etc., but in overall API design: immutable objects, views to say how data is used, and so on.

D3D's regular refreshes allow it to stay close to the hardware, so it makes sense that D3D10 operates 'best' for the hardware around now and in the near future; as such, OpenGL 3.0 is going to have practically the same interface.

Michael Gold
05-23-2008, 02:52 PM
He meant that in a positive way (not being tied to MS or whatever), but I was surprised that he could claim IGNORANCE could ever be positive.
I believe the ignorant ARB guy actually meant that he didn't want to be accused of plagiarism, and felt that he could design a clean, efficient API based on years of developer feedback, implementation experience and familiarity with past, current and future hardware without using another's work to seed his own creative process.

Zengar
05-23-2008, 03:04 PM
And, where are the results of his creative process? :p

Michael, no offense meant, but we really want to see some results... we were promised them more than half a year ago!

Brolingstanz
05-23-2008, 03:28 PM
Man this is a tough crowd! It's like the Trembling Hills Asylum on karaoke night ;-)

Mars_999
05-23-2008, 05:12 PM
I agree, an API is an API, but that isn't the issue with GL vs. DX. It comes down to what you want to do with each; for me, I want my engine to run on everything, not just Windows. I would rather be on OSX these days, but for now I am on Windows. This is why I use SDL/GL. There isn't any compelling reason that I know of to use DX over GL, other than liking MS better. Why do I say this, you ask? Well, glad you asked: most people say it's just an API, just used to get work done, so why not use an API that hits all platforms and make more money, or support more users... Using an API that paints you into a corner is for sure the most logical solution! LOL

I just don't see GL going away, with companies like Apple, Sony, and others using it for their graphics API. Unless raytracing takes over, it's going to be here. I just wish they would throw out a status update. The sad fact is I can already hear it now: "GL3.0, come and get it, it's available to use." "What! This is crap, I waited around for this, and this is what I get for waiting so long? Bullcrap!" storms off mad... I am already waiting for it.

Brolingstanz
05-23-2008, 05:59 PM
Well, I think what we need are free OpenGL hats and coffee mugs.

Michael Gold
05-23-2008, 09:00 PM
And, where are the results of his creative process? :p

Michael, no offense meant, but we really want to see some results... we were promised them more than half a year ago!
It's a fair question, and I take no offense at any of the comments made here in frustration. Come to the OpenGL BoF at SIGGRAPH, and you will find the answers you seek.

RenderBuffer
05-23-2008, 09:21 PM
Woooo!! Thanks for letting us know!

Andrew.

Jan
05-24-2008, 12:01 AM
Woohoo! Now i can claim i successfully provoked a response :-P

Michael: I just want to make sure you know this wasn't meant personally in any way. I understand how you meant it now; whether i agree, i'm not sure yet. Maybe, as pointed out above, it is a good thing to have people on the team who are not influenced by the work of others.

Back then, when "the ignorant ARB guy" ;-) made that comment, i didn't reply to it because i didn't want to spoil the party feeling. Well, it's gone now anyway. :-(

Puh, SIGGRAPH is soo far away! There is a counter on this page:
http://www.siggraph.org/s2008/

78 days left! What are we going to do all that time? And more importantly, will there be drivers ready when GL comes out, or do we need to wait another year for them? (i realize i have actually given up trying to conceal my frustration about this whole thing)

Jan.

Zengar
05-24-2008, 02:00 AM
Well, Michael's answer can be interpreted in a number of ways... It could be "we are releasing at SIGGRAPH", but somehow I get the feeling it is merely "you will get an update at SIGGRAPH" :p I will be really surprised if we see GL3 released in 2008.

elFarto
05-24-2008, 02:44 AM
Well, Michael's answer can be interpreted in a number of ways... It could be "we are releasing at SIGGRAPH", but somehow I get the feeling it is merely "you will get an update at SIGGRAPH" :p I will be really surprised if we see GL3 released in 2008.
Judging by the language he's using (here and in private emails) and the time that's passed, I would be very surprised if it's just an update.

I'm hoping the extra time they've had to finish it was used to get a driver written. Because let's face it, if the spec were released with no drivers, we would be equally disappointed.

"Great we've got a new spec...but we can't use it..."

Of course, if they just release the spec, and say you'll need to wait 6 months for a driver, that's going to annoy a lot of people.

And if it's just an update, I think people will seriously lose all faith in the ARB/Khronos' ability to deliver updates to OpenGL.

Regards
elFarto

bobvodka
05-24-2008, 03:38 AM
tbh, no; if they have spent, say, 6 months writing drivers and keeping the specs under wraps, I personally would have preferred the specs 6 months earlier.

Like others here I work 8h a day, so having that extra 6 months of downtime to look over the API and see how it fits together would have been a great bonus: when the drivers appeared, we could have hit the ground running.

It also wouldn't have seemed like everything was dead for a year, and pretty much all the frustration in evidence here wouldn't have existed; granted, there would have been some over the lack of drivers, but at least everyone expects lag time on those.

Anyway, as a counter to Mars_999's point about 'getting the biggest sales': frankly, if you wanted to do that you'd go and use XNA with C#; you'd hit Windows and the XB360, which is a MUCH larger install base than Win+Linux+OSX combined.

Granted, if you design your backend properly this shouldn't be an issue anyway; you'd just use the API you wanted on the platform of choice. It's certainly doable; I cite the recently released Penny Arcade game as an example of a 'total' cross-platform release.

So the choice of API really doesn't matter; what matters is your ability to code and abstract a decent backend.

Mars_999
05-24-2008, 05:08 PM
Hold up, XNA isn't a good choice, because you can't sell the game and make money on it unless you find a publisher, and best of all it's limited to DX9 hardware. So no GL3.0/DX10 features. If you are talking LARGEST SALES, then Windows isn't it; it's the WHOLE PC industry which is the largest unit, and that includes Win32, MacOS, Linux/Unix, and others. Remember, GL isn't just for games; other fields use GL for their graphics too.

Brolingstanz
05-24-2008, 08:14 PM
Why is everyone so keen on world domination?

I'd be happy with a modest slice of any single pie (as long as it's not the blueberry buckle). I think indie developers would do well to get a foothold on a familiar platform and go from there... get some interest, success, some monetary flux coming in, then think about allocating the time and money porting to other platforms.

Cost benefit analysis for large companies is no doubt a different ball of wax, but for the little guy to worry about making his first commercial venture global seems just a tad ambitious, if not a little silly.

dor00
05-25-2008, 01:00 AM
Why so much concern about Microsoft? Honestly, we talk about OpenGL here; please stop comments like "DX10 has...", "Windows has..." and so on.

We should do something more constructive, so we can say "OpenGL3 has..."

I hope Khronos will handle the next OpenGL3 with care, as many of us have high expectations for it.

Instead of flaming about OpenGL3 vs DX10, we'd better ask the people from Khronos.

And maybe Khronos and ARB people read this thread and will figure out all the problems they need to solve.

From my point of view, I would be very happy if I could have even some alpha builds to play with.

First of all, let's see what OpenGL3 is in reality, then start comparing; until then, stop spamming.

V-man
05-25-2008, 04:01 PM
Why so much concern about Microsoft? Honestly, we talk about OpenGL here; please stop comments like "DX10 has...", "Windows has..." and so on.


It's difficult to ignore MS. They output so much software and technology. They buy out companies quite a lot. Basically, they are an IP collector. They attract a lot of interest, from newbies doing some PACMAN with XNA and C# to university grads.

Mars_999
05-25-2008, 06:04 PM
Hey V-man, you are correct, you can't ignore MS. They are the largest and have the most influence on what happens.

And as for dor00: what else do you need for constructive talk? GL right now has DX10 features on WinXP if you have Nvidia hardware. Isn't that constructive?

Zengar
05-25-2008, 10:17 PM
Why so much concern about Microsoft? Honestly, we talk about OpenGL here; please stop comments like "DX10 has...", "Windows has..." and so on.


It's difficult to ignore MS. They output so much software and technology. They buy out companies quite a lot. Basically, they are an IP collector. They attract a lot of interest, from newbies doing some PACMAN with XNA and C# to university grads.

This is true. XNA is obviously the tool of choice for new programmers. I know lots of fellow students who are interested in game programming and start with XNA, because it is easy.

Hampel
05-26-2008, 12:22 AM
But I haven't seen a convincing example of a 3D application/game based on C# and XNA... Are there any?

dor00
05-26-2008, 01:49 AM
But I haven't seen a convincing example of a 3D application/game based on C# and XNA... Are there any?

I am too old-school C/C++; I don't give a &*&* about C#. Same for XNA.


Hey V-man, you are correct, you can't ignore MS. They are the largest and have the most influence on what happens.

And as for dor00, what else do you need for constructive talk, GL right now has DX10 features on WinXP if you have Nvidia hardware. Isn't that constructive?

Well, it's true, one of the powers of OpenGL comes through extensions.

About MS: one of the reasons DX is so popular is that it was updated very often, I think. Windows carries an old opengl32.dll... not updated. At least, IMHO, with Khronos people will have trust (I hope) that OpenGL development and improvements will continue.

As for wishes for OpenGL3, I really have some, but I don't know where to send my suggestions.

OpenGL is a great API, I hope Khronos will handle with care that.

jcornwall
05-26-2008, 04:44 AM
I am too old-school C/C++; I don't give a &*&* about C#. Same for XNA.
You should care about C#. Both it and other managed languages represent the future of software development, with greatly increased productivity at the cost of some performance; but hardware is cheap and people are not. There are some very good arguments for (and substantial research into) how JIT'd languages will one day exceed the performance of statically compiled languages by taking advantage of runtime context.

Managed languages are relevant today even without exceptional performance. APIs such as OpenGL allow the programmer to claw back a lot of application performance without substantially reducing development productivity. A discussion of the viability of 3D development in those languages is very much relevant to this forum and to OpenGL 3.0. (Particularly as Microsoft has already done it with Direct3D 10 and XNA.)

dor00
05-26-2008, 05:09 AM
I am too old-school C/C++; I don't give a &*&* about C#. Same for XNA.
You should care about C#. Both it and other managed languages represent the future of software development, with greatly increased productivity at the cost of some performance; but hardware is cheap and people are not. There are some very good arguments for (and substantial research into) how JIT'd languages will one day exceed the performance of statically compiled languages by taking advantage of runtime context.

Managed languages are relevant today even without exceptional performance. APIs such as OpenGL allow the programmer to claw back a lot of application performance without substantially reducing development productivity. A discussion of the viability of 3D development in those languages is very much relevant to this forum and to OpenGL 3.0. (Particularly as Microsoft has already done it with Direct3D 10 and XNA.)

You registered on the site just to reply to that? Haha... I LOL'd IRL! :)

Cut the crap, please; go spread your opinions somewhere else and try to fool newbies.

V-man
05-26-2008, 06:06 AM
But I haven't seen a convincing example of a 3D application/game based on C# and XNA... Are there any?

I hang around at gamedev.net and I have seen a couple of interesting things, although in one of them the guy said he had used the models and shaders from the samples. It comes down to the same old problem: you need a team to make an exciting game. There are exceptions where a 1-person team makes something visually exciting, but it is usually close to being considered PACMANish.

About C#, Java and so on: I haven't yet heard of any high-end game using them, and it's understandable, because it seems as if they are limited to basic x86 instructions. Does anyone know if the new Java compiler, or the VS2008 compiler, can generate SSE code?

Zengar
05-26-2008, 06:39 AM
I am too old-school C/C++; I don't give a &*&* about C#. Same for XNA.
You should care about C#. Both it and other managed languages represent the future of software development, with greatly increased productivity at the cost of some performance; but hardware is cheap and people are not. There are some very good arguments for (and substantial research into) how JIT'd languages will one day exceed the performance of statically compiled languages by taking advantage of runtime context.

Managed languages are relevant today even without exceptional performance. APIs such as OpenGL allow the programmer to claw back a lot of application performance without substantially reducing development productivity. A discussion of the viability of 3D development in those languages is very much relevant to this forum and to OpenGL 3.0. (Particularly as Microsoft has already done it with Direct3D 10 and XNA.)

You registered on the site just to reply to that? Haha... I LOL'd IRL! :)

Cut the crap, please; go spread your opinions somewhere else and try to fool newbies.

Well, I have been registered here for some years and I fully agree with jcornwall. Even now C# and Java are at about 80% of C's performance, and given that you program faster and make fewer errors with them, that's not so bad, is it?

p.s. I tested the latest Java update on Mac, and I have to say, "wow". They really improved the JIT! The server version now runs faster than the C (-O3) version on all tests except FFT.

Zengar
05-26-2008, 06:44 AM
About C#, Java and so on: I haven't yet heard of any high-end game using them, and it's understandable, because it seems as if they are limited to basic x86 instructions. Does anyone know if the new Java compiler, or the VS2008 compiler, can generate SSE code?

The Sun JIT generates SSE code where it sees fit. Still, performance-hungry parts are of course best written in a low-level language like C. But that is where you can use native-code libraries with your Java app. With C# it is much easier, as the cost of calling native code from the CLR is much lower. Still, all current CLRs are rather slow; if MS made a JIT as good as Sun's, it would surely beat all native compilers and Java by a large margin, because the CLR has more room for optimization (native generics, unlike Java, and struct types). But alas, .NET still isn't even unrolling loops in most cases, much less using advanced optimizations :p

Brolingstanz
05-26-2008, 07:02 AM
C# is awesome. Don't believe it? Try it. If it weren't for some low level memory and instruction stuff I need to do, I'd be using it exclusively. Only way to fly for modern GUIs on Windows as far as I'm concerned.

There's always the mix, C++/CLI, which lets you intertwine C++ with managed CLR code... the best and worst of both worlds, but a pretty nifty way to wrap up those hard-to-wrap libraries and such.

I'll bet load speed is favored over lengthy optimized compiles, but that's sure to improve in the future.

And yes, the value/reference type distinction was a smart move indeed.

dor00
05-26-2008, 08:22 AM
Well, I have been registered here for some years and I fully agree with jcornwall. Even now C# and Java are at about 80% of C's performance, and given that you program faster and make fewer errors with them, that's not so bad, is it?

p.s. I tested the latest Java update on Mac, and I have to say, "wow". They really improved the JIT! The server version now runs faster than the C (-O3) version on all tests except FFT.

C# has been around for a good chunk of time. Same for Java, even longer.

Now, please, name 10 commercial applications that are made entirely with them. Just name 10. No open source.

I await the reply.

Zengar
05-26-2008, 08:34 AM
google is your friend...

Demirug
05-26-2008, 09:34 AM
I think one of the relevant C# applications in this context here is FXComposer.

knackered
05-26-2008, 09:46 AM
It's a fair question, and I take no offense at any of the comments made here in frustration. Come to the OpenGL BoF at SIGGRAPH, and you will find the answers you seek.
Good lord, Michael Gold!
I somehow feel much more optimistic now we've had an official wink from nvidia.

Nicolai de Haan Brøgger
05-26-2008, 09:47 AM
I think XNA is one of the best strategic decisions Microsoft has taken in years. The future is in managed languages, and there's no question that Zengar and jcornwall are right about some of the reasons why. Lower cost of development and maintainable, stable, secure applications are worth paying a few percent of performance for. Software is getting more and more complicated, and we need more levels of abstraction to manage it and leverage multi-core programming properly. I think most developers who work on something just a little more complicated than tech demos already realize this.

knackered
05-26-2008, 11:08 AM
I remember this kind of talk when Java came out. If you're competing in the marketplace on speed of execution (most renderers are), then choosing the language that gives you the best optimisations is obviously the right way to go, regardless of how fast CPUs/GPUs are getting. A few percent is all it takes to miss a vsync and make the user experience unpleasant.
It's all relative to your competitors. Of course, eventually it flips over to diminishing returns, but we're definitely not at that stage yet. We're all CPU bound.

Demirug
05-26-2008, 11:30 AM
Most games today waste millions of cycles by using an interpreted script language. We also sacrifice performance for flexibility in plenty of other places.

This whole discussion has happened before, when game development switched from assembler to C and then from C to C++. IMHO we will see another switch in the near future as C++ reaches its limits.

Zengar
05-26-2008, 11:32 AM
No one is going to write a "high-end" game (like Crysis) using Java, this is clear. It lacks the potential for app-specific optimization, as you can't mess with low-level stuff. But modern VMs do a pretty nice optimization job for "normal" applications; they determine the memory usage patterns and the behaviour of your code, and optimize for it automatically, without extra programmer work. That is the nice thing about them. Therefore, they are a good choice for "simpler" games that don't need all the extra performance your computer can provide. I am sure lots of people will buy a cheap game that may not have the latest graphics but is interesting to play.

MZ
05-26-2008, 01:00 PM
Most games today waste millions of cycles by using an interpreted script language. We also sacrifice performance for flexibility in plenty of other places.

This whole discussion has happened before, when game development switched from assembler to C and then from C to C++. IMHO we will see another switch in the near future as C++ reaches its limits.
I agree with your prophecy about C++. However, I wouldn't say that "the language after C++" is going to come from the "interpreted script language" camp.

As for C# in particular, it's an overrated language, and I don't believe its career in the game industry will be significant at all. C#'s existence is a direct result of Microsoft's Java envy; the language was designed by the author of Delphi, and it may be a cool VB replacement, but not a C++ replacement.

Neither of the mentioned languages represents the pinnacle of PL research. To replace C++ we need something totally out of their league, and I don't mean that as flattery for C++.

davej
05-26-2008, 01:05 PM
C# has been around for a good chunk of time. Same for Java, even longer.

Now, please, name 10 commercial applications that are made entirely with them. Just name 10. No open source.

I await the reply.
Your question is a bit like saying "Name me 10 commercial Cobol applications." JVM- and CLR-based languages are currently mainly used for in-house stuff.

Why no open source? Eclipse is open source and is hardly somebody's toy project. It also has commercial applications built around it: IBM's Rational Application Developer, the latest Lotus Notes client, etc.

dor00
05-26-2008, 01:13 PM
C# has been around for a good chunk of time. Same for Java, even longer.

Now, please, name 10 commercial applications that are made entirely with them. Just name 10. No open source.

I await the reply.
Your question is a bit like saying "Name me 10 commercial Cobol applications." JVM- and CLR-based languages are currently mainly used for in-house stuff.

Why no open source? Eclipse is open source and is hardly somebody's toy project. It also has commercial applications built around it: IBM's Rational Application Developer, the latest Lotus Notes client, etc.

No open source; commercial applications only. I have a hidden point here.

It's funny how some people take C# into account and support it so much, yet they CAN'T name 10 commercial C# applications... guess why?

pudman
05-26-2008, 01:34 PM
Let's pull this all back into perspective.

The only reason we're talking languages is because dor00 proclaimed distaste for anything but C++. Fine. However, I find language of choice largely irrelevant when talking about OpenGL. It's accessible from all of the languages mentioned so far, by some means.

Just as language tends to be a personal preference, the effective speed of one's OpenGL program is also a personal preference. Sure, we all like FAST, but it's not always a requirement. That applies to languages as well. Do I need the FASTEST possible execution? Well, it depends on the domain.

So please, we all know how great language X is compared to Y. Yay.

Jan
05-26-2008, 01:52 PM
I like C++. Why? Don't know, maybe because it is the only language i use, and people tend to like things they are familiar with. I haven't tried C#, yet. Why? Because it is (currently) insignificant for games. Also, because some APIs and tools are written in C++ (not C) and are thus difficult to interface with from languages other than C++.

I would love to use languages other than C++ (like D or maybe even C#), because C++ has many deficiencies, but as long as there is no replacement with significant momentum (ie. support by MANY vendors), C++ is the only real option.

So this is a chicken-and-egg problem. I don't want to use another language, because few people use languages other than C++ for games, but as long as people decide like me, the situation won't change.

Therefore, although i think that C++ is not the great language anymore that it was 10 or 20 years ago, i believe it will stick around for decades (literally), and i am not so sure whether C# or something completely different/new will replace it. Especially looking at multi-core programming, it might be that completely new languages come up. Functional programming seems to be becoming much more popular as one way to "solve" parallel programming.

I don't think the languages used will change dramatically in the near future. I only fear that there won't be a de facto standard, as we have/had with C/C++, during the time that people search for better languages.

Jan.

jcornwall
05-27-2008, 03:05 AM
Especially looking at multi-core programming, it might be that completely new languages come up. Functional programming seems to be becoming much more popular as one way to "solve" parallel programming.
This is already beginning to happen, but functional languages represent a large deviation from the norm, one most programmers are not keen on. I'm not even convinced they are suitable for large-scale application designs, but that's another issue.

Far more successful are hybrid approaches that embed functional concepts into imperative languages. Intel's Threading Building Blocks is a particularly good example, using C++ constructs (and assisting libraries) to break programs up into functionally parallel tasks.

(Of course, now we're really off-topic so I won't continue down this road!)

Nicolai de Haan Brøgger
05-27-2008, 03:26 AM
In any case, the more object-oriented approach of the GL3 API seems to be a step in the right direction :)

Khronos_webmaster
05-27-2008, 04:53 AM
There are some stray postings occurring in this thread that have gotten a bit inflamed. This is a family board, and we like keeping our threads on topic. If you wish to discuss C, C#, XNA and how they relate to 3D graphics and OpenGL, please move it to a new thread. Let's move back to the topic of OpenGL 3 updates.

Thank you all for your co-operation and for keeping it clean.

Mark Shaxted
05-27-2008, 05:49 AM
Let's move back on topic of OpenGL 3 Updates.

What updates? ;-)

bertgp
05-27-2008, 07:04 AM
@ Khronos_webmaster : Where are you in Montreal? Maybe I could swing by and have some questions answered? ;)

knackered
05-27-2008, 07:34 AM
Did I miss an update on GL3?
I don't see how that's possible, I've been constantly hitting F5 since 2006.

Mars_999
05-27-2008, 10:00 AM
I think part of the reason why GL3.0 is taking so long to come out is GLSL: the way every shader stage is entered through void main(void), versus something like void mainVS(void), void mainFS(void), void mainGS(void). They would like to make an effect file like HLSL's .fx, named .glx or whatever you like. I am assuming they will have to break some legacy code to do this?
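For context, this is what the single fixed entry point looks like today: each stage must define main() in its own compilation unit. (The mainVS/mainFS naming above, and any .glx effect format, are the poster's speculation; neither appears in a released GLSL spec.)

```glsl
// Today's GLSL convention: each stage is compiled from its own source,
// and each stage's entry point must be called main().

// --- vertex shader source (one compilation unit) ---
void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// --- fragment shader source (a separate compilation unit) ---
void main()
{
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

An HLSL-style effect file would instead put several named entry points in one file and bind them per technique, which is what the post is speculating GLSL might adopt.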

Korval
05-27-2008, 01:26 PM
Come to the OpenGL BoF at SIGGRAPH, and you will find the answers you seek.

Oh, I'll find the answers I seek? Somehow, I doubt that.

Because I don't want to know what GL 3 is like anymore. I've accepted that it will be a marketplace failure. It doesn't matter how good the API is, or how easy it will be to write drivers for. ATi and Intel will not support it in any significant way. They have proven that they do not care to provide any meaningful support for OpenGL (crappy drivers are not meaningful support). Without the support of 2/3rds of the video hardware out there, GL 3 will die. Apple and Linux may use the API, but only because they write drivers explicitly for it. Or because Apple can make its IHVs write decent drivers for the API.

But as far as the Windows ecosystem is concerned, OpenGL 3.0 will be stillborn. I've come to terms with this fact (which is one reason I haven't been active on these boards recently). If you want to write stable applications that use 3D graphics on Windows, applications that can be run on multiple systems and hardware without much concern for breakage, be prepared to use Direct3D. End of story.

What I want is an explanation. I want to know why GL 3.0 was promised in September of last year and not delivered. And I mean a detailed explanation too, not a "we wanted to make it better." I want to know how a group of reasonably intelligent people can completely and utterly fail to develop a rendering API in 12 months. I want to know why no explanation for the delay has been given more than 8 months after the initial delay. I want to know why no progress updates have been given almost a year after the previous updates.

The OpenGL community deserves an explanation. This API has been mishandled too often for us not to get one. First, it was the original GL 2.0, which was the rewrite that we're getting now. If we'd just taken the hit back then, we wouldn't have a problem now. But no, we decided to drop that. Next, it was VBO; there was no excuse for the ARB not to provide some mechanism like this earlier (and it still has problems). Then, it was FBO; the ARB spent a year on a design that they discarded in favor of FBO, which took another year to crank out. And it still isn't widely supported. And now this rewrite, which is basically 6 years late, or 8 months in more forgiving terms.

The OpenGL BoF needs to address these points. It's that simple. The OpenGL community deserves to know why the ARB has so mishandled and mismanaged this API.

knackered
05-27-2008, 02:25 PM
The OpenGL community deserves to know why the ARB has so mishandled and mismanaged this API.
...or in other words, why they buried this API. It stinks of Microsoft dirty tactics.

bobvodka
05-27-2008, 02:27 PM
I agree firmly with Korval; there isn't really much more I can add to it.

I also wanted to thank everyone for the 'lulz' regarding C#, XNA and all manner of garbage that got rattled off over the last few pages; I needed something amusing to read when I got back from my weekend away :)

Lord crc
05-27-2008, 03:35 PM
It stinks of Microsoft dirty tactics.

From what I can gather, Khronos doesn't have a clear "command hierarchy" (inside the Promoter group), and as such I fear that they've managed to drop the ball all by themselves. I've seen this happen in other "flat" democratic communities.

Korval
05-27-2008, 07:51 PM
It stinks of Microsoft dirty tactics.

I don't see how you can play the "Microsoft did it" card on GL 3.0. By what power could Microsoft affect the ARB? They're not even on it anymore. They might have screwed up GL 2.0's rewrite (and I freely admit that 3DLabs's initial proposal needed work) and maybe VBO. But by the time render-to-texture was under discussion, Microsoft had left the ARB.

The only possible leverage they might have is some IP claim. But over what? Glslang is sufficiently different from any of Microsoft's shading languages to make any such pressure invalid.


From what I can gather, Khronos doesn't have a clear "command hierarchy" (inside the Promoter group), and as such I fear that they've managed to drop the ball all by themselves. I've seen this happen in other "flat" democratic communities.

I'm not sure what the Promoter group could have to do with the GL ARB. I don't know what the Khronos hierarchy is like, so I don't understand the reference. As far as I understood it, each Khronos working group is on their own separate timetable.

BTW, I bet you that C++0x will have a full draft specification before GL 3.0 ships. Allow me to explain in some detail how stupid that makes the ARB.

In the time it took the OpenGL ARB (who supposedly have phone-call meetings 5 times a week) to make a graphics API, a council that meets 3 times a year will have sifted through thousands of proposals for C++ language and library extensions, pruning the weak and enhancing the strong. They will have had to merge incompatible specifications, resolve how one specification influences another, revise all of their old library components to properly use the new features, add all-new library components that properly use the new features, and so forth.

All in the same time it took the GL ARB to make a graphics API. At this point, the only explanation besides "We're all incompetent" is "We wrote two new shading languages, one low-level and one high-level, from scratch." And even that one's pushing it.

Brolingstanz
05-27-2008, 08:28 PM
I think Lord_crc's point about lack of a clear leader in the decision making process may be a good one. If say Nvidia had been given the reins from the get go they'd likely have driven this thing into the end zone pretty quickly, and in a form I think most of us would be very happy with.

There were some allusions to "contention" among Khronos members earlier in the thread that support that line of reasoning.

P.S. I know many folks have been waiting a long time for a major revision, but in all fairness is 1 year that long to wait from the date of inception? I'm on the edge of my seat like everyone else, but let's be reasonable ... even DX revisions take a year or longer.

RenderBuffer
05-27-2008, 08:39 PM
I don't think there is any reason to be so negative. The ARB and its members are not incompetent, and if I were a member of the ARB I'd be rather upset at the suggestion.

There is a misplaced sense of deservedness here, and I think we should be grateful that there are competent people who are willing to spend their time on OpenGL 3.0. If things take more time than originally expected, that costs the ARB members their time, but the rest of us are able to continue using the existing API. If GL3 were cancelled we might all be disappointed, but no one here *deserves* anything.

Let's try to be a bit more positive.

Thanks to the ARB,
Andrew.

Toni
05-27-2008, 11:08 PM
I don't think they're as incompetent as that; they just didn't think about the community that has been (and still is) waiting for an API for more than a year.
Just saying "it will take a lot longer than expected, because of whatever" would make us relatively happy.
Anyway, I think there hasn't been any major new tech since the NV extensions for geometry shaders and such, am I right?
So in some sense, regarding NV graphics cards, we're more or less on par with DX10, except of course we would like a cleaner API :)
And since ATI and Intel think that OGL isn't important enough to implement such extensions or even promote them to EXT... well, here we are, we can't use DX10 features in OpenGL in a general way :P

It's funny in some perverse sense... because on Linux you can only rely on NV to do 3D, and the same applies to OGL on Windows :) Amazing.

So yes, thanks to the ARB, but don't wait till SIGGRAPH, say something, dudes! :)

Toni

dor00
05-27-2008, 11:56 PM
Does Khronos totally control ARB now?

In other words, is Khronos totally responsible for the future of OpenGL?

Korval
05-28-2008, 12:59 AM
If say Nvidia had been given the reins from the get go they'd likely have driven this thing into the end zone pretty quickly, and in a form I think most of us would be very happy with.

To be fair, if nVidia had been given the reins from day one, they'd have made an API that tries its best to screw ATi and Intel hardware as much as possible. nVidia looks out for itself first, last, and always.

We may have liked the form of the API, but we wouldn't necessarily like the implementations on non-nVidia IHVs.


I know many folks have been waiting a long time for a major revision, but in all fairness is 1 year that long to wait from the date of inception? I'm on the edge of my seat like everyone else, but let's be reasonable ... even DX revisions take a year or longer.

No, 1 year from inception isn't too long. But that's the problem; it's well over 2 years now since the first peep about what would become OpenGL 3.0 was revealed. ATi and nVidia were collaborating on the beginnings of a rewritten version of OpenGL. The first newsletters detailing the evolution of GL 3.0 started over 18 months ago. September 2007 was not the inception of GL 3.0; it was when we were promised a finished GL 3.0.


The ARB and its members are not incompetent

When I see competence displayed by the ARB, I will accept that they are competent. Competence, in this case, being that you release what you said you were going to when you were going to.


the rest of us were able to continue on using the existing API.

The rest of who? OpenGL is a joke on Windows. Sure, nVidia provides reasonable support (but even their drivers have strange quirks), but reliance upon ATi and Intel drivers is simply stupid. Unless you're using the Doom3 engine or some derivative thereof (and thus IHVs must implement at least enough GL to run your engine), you cannot rely on OpenGL support on Windows. It's as simple as that.

The most important aspect of GL 3.0 was the possibility of better drivers. By making the API simpler to implement, driver quality would improve. Of course, you can forget about that now; if the ARB can't get a simple spec out after a year of development, there's no way we can rely on the IHVs on the ARB to write a decent GL 3.0 implementation. They just don't care. Nobody will be using it, so there's nothing to be gained from writing one.

I'm sure OpenGL 3.0 will be a nice API, but it will be little more than a piece of paper. It will not be a functioning 3D API that you could reasonably rely on in Windows.


Let's try to be a bit more positive.

Positive? I was positive 6 months ago, when this thread was first started. Since then, I have seen not one shred of evidence that any form of progress is being made. And considering the history of the ARB (as I previously explained), being positive at this point is just naivete.

dor00
05-28-2008, 03:29 AM
Korval, you're right.

Your theory can only fail if Khronos wants to make a B_I_G surprise. Translate B_I_G as BIG BIG stuff and improvements and updates and whatever you want. Otherwise I can't see the reason for this silence around OpenGL 3.

bobvodka
05-28-2008, 05:20 AM
IMO a big surprise does NOT make up for nearly a year of silence once SIGGRAPH rolls around.

Last September we were told 'nearly done'; at the start of this thread it was 'a few issues to iron out'.

Frankly, unless the big surprise is 'NV, AMD and Intel all have GL3.0 drivers shipping _NOW_ which conform to a reference implementation', I will be less than impressed. And by 'now' I mean 'drivers released the same day for XP, Vista, Linux and OS X', in both 32-bit and 64-bit variants, for all hardware which supports GL3.0.

Given the already demonstrated levels of competence in driver creation displayed by AMD and Intel when it comes to OpenGL, I'll be surprised if this is the case...

speedy
05-28-2008, 08:15 AM
The rest of who? OpenGL is a joke on Windows. Sure, nVidia provides reasonable support (but even their drivers have strange quirks), but reliance upon ATi and Intel drivers is simply stupid.

Korval, released games like "Penumbra: The Black Plague", "Quake Wars", "Doom 3", "The Chronicles of Riddick", "Perry Rhodan: The Immortals Of Terra" and knowing that ID tech 5 and (at least) one big UK developer use OpenGL does away with your claim easily.

Jan
05-28-2008, 08:51 AM
ID tech 5 uses OpenGL only for Linux and Mac. Right now, the Windows version is D3D9-based.

The reason is that on Linux and Mac there is no other choice. On Windows, where you have the choice, id chose D3D9 over OpenGL. And of course D3D9 was needed for the Xbox, too.

Knowing that id was always a strong supporter of OpenGL, it is quite disturbing to see that even they have changed their preferences. If GL3 is out in time and is indeed a good API, I am sure they will switch the Windows version to use it instead. The only reason not to do so would be to avoid maintaining a fourth rendering back-end (Linux, Mac: GL 2.1; Xbox: D3D9; PS3: whatever; Windows: GL3). So only if it is also available for Linux and Mac in time might they actually use GL 3.

Jan.

Jan
05-28-2008, 08:57 AM
Oh, and BTW: "Penumbra: The Black Plague" and "Perry Rhodan: The Immortals Of Terra" are adventure games, which usually have much lower graphics demands.

"Doom 3" and "The Chronicles of Riddick" are OLD. "Quake Wars" is still Doom 3 engine based and doesn't have really compelling graphics, compared to what is possible.

Apart from your list, I don't remember many more games that use OpenGL, so "does away with your claim easily" is not convincing. Especially not considering that there are 60+ games in development that are Unreal Engine 3 based ALONE (so all D3D).

OpenGL 2.1 is definitely not an API used much by games.

Jan.

NeARAZ
05-28-2008, 09:39 AM
The rest of who? OpenGL is a joke on Windows.
+1

The single most important point for an API to get right: it needs to work.

Everything else: whether the design is nice, whether it changes with each hardware generation or not, whether it uses C or COM objects or C++ classes or AJAX ;), and so on and so on, is only secondary. If the API and everything that comes with it (in OpenGL's case, drivers) works, then everything else can be solved.

OpenGL has failed miserably in the "works" area on Windows. Some of that can be blamed on the complexities of an API that has evolved over more than a decade, some of it can be put onto drivers that have to be fully written by IHVs, some of it is because Direct3D just advanced further and people switched over to it, so IHVs did not put that much effort into OpenGL, and so on. But the reality is that Direct3D on Windows is far, FAR more stable than OpenGL on Windows.

If OpenGL 3 had focused on just two things: 1) use a common runtime everywhere (the GLSL parser and basic optimizer is probably the most complex piece) and 2) get rid of all the complexity and make a simple API instead, then maybe the drivers could have improved. Coupled with the fact that DX10 is tied to Vista, it might have had a chance... But then GL3 should have been out two years ago, because right now it's too late already.

None of the above matters that much though. OpenGL is still the only way to go on OS X and Linux - fine. Direct3D (9 or 10) is much better on Windows - fine as well. Consoles have their own APIs or much lower level access to hardware - fine. Use whatever API is suitable for the job, not a big deal. The "single API on all platforms" is a myth anyway. And rewriting the codebase to a new API (when it targets similar hardware) is quite easy (yes, I have done that).

Ok, enough of incoherent rambling.

ZbuffeR
05-28-2008, 11:59 AM
ID tech 5 uses OpenGL only for Linux and Mac. Right now, the windows-version is D3D9 based.
Your reference, please?
I am quite sure it was GL on all computer platforms, even if the possibility of going D3D on Windows in the future was mentioned.

Jan
05-28-2008, 12:23 PM
http://www.beyond3d.com/content/news/462

That's the article I could easily find through Google. I admit that it only hints at it, and the golem.de article is actually even less definite about it (I think Beyond3D read a bit more into it than was actually said), but I remember having read more recent comments by John Carmack where he actually said that the current (!) Windows version uses D3D9 (nothing is said about how it will be when the game is released). Though I can't find those comments right now.

Jan.

Korval
05-28-2008, 01:52 PM
use a common runtime everywhere (GLSL parser and basic optimizer is probably the most complex piece)

And just who is going to write and maintain it?

Apple's GL implementation does this very thing. But then, it is Apple who writes and maintains it. Microsoft certainly has no incentive to do so.

Khronos does not have any significant programming staff. And it would most assuredly require a significant programming staff to write and maintain a common runtime environment.

ZbuffeR
05-28-2008, 01:55 PM
This crappy Beyond3D article misread "D3D9-class hardware" as meaning the API.
If you have first-hand info, please share; otherwise it is not worth it :)

knackered
05-28-2008, 03:03 PM
They don't really need a common runtime if the new API is as simple and clean as it looked like it was going to be. Get rid of all that format conversion bollocks and it's easy (being able to submit any old data types to the likes of TexImage etc. and expect the driver to convert them for you). Most of the state-spaghetti madness is gone because it's all moved to shaders.
And for the love of god, just allow any shader language to be used with GL3 - drop GLSL. If you remove the built-in uniforms, it's totally API-independent. Then we get the same shader-compiler reliability D3D users get.
All I want is something as clean as D3D10, but with quad-buffered stereo and swap locking, and that runs on XP, Linux, OS X and possibly Vista. And I want it now-ish. Is that really too much to ask of Khronos?

MikeC
05-28-2008, 03:03 PM
I don't see how you can play the "Microsoft did it" card on GL 3.0. By what power could Microsoft affect the ARB? (...) The only possible leverage they might have is some IP claim.

I dunno about that. One of the more subtle consequences of D3D having no extension mechanism is that Microsoft get to play gatekeeper for anything the IHVs come up with. If FooCorp implements whizzy feature X while BarCorp implements whizzy feature Y, and FooCorp isn't on Microsoft's Christmas card list for some reason, it's entirely possible that D3D vN+1 just so happens to expose Y but not X, and FooCorp is relegated to the bargain bin for a generation. That's a fair amount of potential leverage, if you were trying to persuade someone to drag their feet a bit.

No idea whether this is actually happening; this is just wild speculation. But it wouldn't surprise me. With Vista/DX10 flopping, I can imagine MS getting really quite paranoid about a GL3 threat to their lock on PC gaming.

Korval
05-28-2008, 04:16 PM
This crappy beyond3D article misunderstood the "D3D9-class hardware" as API.
If you have first hand info, please share, otherwise it is not worth it

Even if we assume that id Tech 5 is going to be OpenGL only, that's just one engine. And with id being increasingly marginalized with regard to engines (the Unreal 3 engine is much more prevalent), that one becomes less and less meaningful as time goes on.


Get rid of all that format conversion bollocks and it's easy

Well, you need some format conversion, since you don't want to make the API so inflexible that IHVs can't optimize things they need to. I mean, if you ask for an LA8, and the hardware doesn't support it, it's better for the API to silently give you an RGBA8 and do the conversion. After all, a working program is better than a not working one.

More importantly, format conversion is really the least of GL's implementation problems. It's not exactly the low hanging fruit.
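The silent-fallback idea described above can be sketched in a few lines. This is purely illustrative, not real driver code; the fallback table and `choose_internal_format` are invented for the example:

```python
# Illustrative sketch of silent internal-format promotion, as in the
# LA8 -> RGBA8 example above. Format names and the table are invented.

# Each requested format maps to an ordered list of candidates,
# starting with the exact format and widening to supersets.
FALLBACKS = {
    "LA8":   ["LA8", "RGBA8"],
    "RGB8":  ["RGB8", "RGBA8"],
    "RGBA8": ["RGBA8"],
}

def choose_internal_format(requested, hw_supported):
    """Return the first candidate the hardware supports, or raise."""
    for candidate in FALLBACKS.get(requested, [requested]):
        if candidate in hw_supported:
            return candidate
    raise ValueError("no supported fallback for " + requested)

# A hypothetical chip without luminance-alpha support still runs the app:
print(choose_internal_format("LA8", {"RGB8", "RGBA8"}))  # -> RGBA8
```

The app keeps working at the cost of a little extra memory, which is exactly the trade-off being argued for here.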


And for the love of god, just allow any shader language to be used with GL3

And how do you do that? The point of a specification is to specify behavior. You can't have a gigantic hole in your spec that says, "And then the vertex processor does... something. Based on some value you put here, but we can't say how that works."

If you don't have at least one shading language as a base part of the spec, then you're basically saying, "pass in some string and hope for the best." Which isn't actually solving any problem, since we have no idea what that string should be.


With Vista/DX10 flopping, I can imagine MS getting really quite paranoid about a GL3 threat to their lock on PC gaming.

Why? Even if GL 3 came out at the perfect time (last summer was probably the height of the Vista/DX10 hate. People become more accepting with time) with great driver support, it would hurt DX's standing only marginally. You wouldn't get a wholesale industry shift to GL 3.0. Certainly some would, but others wouldn't, simply due to skepticism over GL 3.0 driver quality. The legacy of GL 1.x and 2.x takes a long time to work out. And it'd still take a year before any GL 3.0 compatible games came out.

Mars_999
05-28-2008, 05:18 PM
You are wrong, Jan. John just stated that the Windows version will be OpenGL, based on 2.0 or 2.1, and that he may target GL3.0 depending on its status, he said. With GL he covers Windows, OS X, Linux and PS3 in one swoop. He is saying that DX9-class hardware will run it, not that DX9 is the API.

speedy
05-28-2008, 05:28 PM
-1 on "OpenGL drivers are a joke".

Usage patterns of Quake Wars are currently on par with 90% of the most recent D3D 9 A titles.

My point is that they can get the job done, as I found out from personal experience. And they are very stable on a *very* wide range of hw. Calling them a joke is overly harsh.

The biggest problem with them is that obscure and very rarely used features can often be too buggy for open-market deployment, since OpenGL is more widely focused. But that will be cleaned up in OGL 3.

Korval
05-28-2008, 07:07 PM
Usage patterns of Quake Wars are currently on par with 90% of the most recent D3D 9 A titles.

I'm not sure what you mean by that. It sounds like you're saying, "If you're able to look at Quake Wars's rendering code and replicate its rendering pattern for your game, then you can use GL 2.1 just fine." If so, then yes, you are correct.

Of course, Quake Wars's source code is not available. Nor are most games Quake Wars, so they will by necessity have different rendering patterns. Expecting everyone's engine to mirror what Quake Wars does does not make for a functional API.


And are very stable on *very* wide range of hw.

Really? I'm no fan of Blender3D myself, but it's a functioning GL application... on nVidia cards. Go ahead and try to use it on ATi hardware. Drivers that match the GL spec will run it; drivers that match only what Quake Wars does and ignore everything else will not.


obscure and very rarely used features

Like glslang; nVidia and ATi cannot agree on what is proper glslang code. nVidia violates the standard whenever they feel the need, yet ATi drivers break on valid code intermittently.

Or reasonable FBO extensions like the depth+stencil buffers or the multisample extensions, which ATi still doesn't support no matter how long they have been out.

NeARAZ
05-28-2008, 09:38 PM
use a common runtime everywhere (GLSL parser and basic optimizer is probably the most complex piece)
And just who is going to write and maintain it?
Get a programming staff then. Assemble a team from IHVs, make it open source, leave it up to the community, branch from Mesa - whatever; the point is that most of the hard parts should be written once.

Because the current situation, where everyone gets their own chance to interpret the spec and to make their own bugs, is just horrible. Even on a platform like OS X - where Apple is probably the only commercial company that needs OpenGL - they still have issues with GLSL parsing (parsing!). Array initializers, I am looking at you.

Use one and only one implementation of the GLSL preprocessor, parser, syntax-tree generator, basic dead-code elimination, basic nop removal, and basic common-subexpression elimination. The syntax tree can be somewhat high level (e.g. native matrix-multiply instructions), but the basic stuff has to have a single implementation. Again, right now even on Apple they sometimes don't optimize (x*1.0) to (x), or sometimes shader creation fails because of some unused function that does not fit into some limits (but it parses correctly... and the function is unused!).
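The kind of trivial cleanup being asked for, like folding (x*1.0) down to (x), is simple enough to sketch. A toy pass over a nested-tuple expression tree, not a real GLSL front-end; the node shapes are invented:

```python
# Toy identity-folding pass: simplifies ('mul', a, b) when either
# operand is the constant 1.0, recursing over a tiny tuple-based tree.
def fold(expr):
    if not isinstance(expr, tuple):
        return expr                 # leaf: a variable name or a constant
    op, a, b = expr
    a, b = fold(a), fold(b)
    if op == "mul":
        if a == 1.0:
            return b
        if b == 1.0:
            return a
    return (op, a, b)

print(fold(("add", ("mul", "x", 1.0), ("mul", 1.0, "y"))))  # ('add', 'x', 'y')
```

A shared front-end doing only passes like this one would already remove a whole class of per-vendor bugs.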


Khronos does not have any significant programming staff. And it would most assuredly require a significant programming staff to write and maintain a common runtime environment.
Again, then get one. Assemble it. Whatever works; just writing the spec and letting everyone implement it is a recipe for the current situation (i.e. disaster).

The only other semi-sane alternative would be to write tests for everything in the spec, up to some quite complex tests. And then make the test suite public, and make it a requirement for any OpenGL implementation to pass everything. Otherwise they can't call themselves OpenGL.
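A public pass-everything suite could be as mechanical as a table of clause checks. A sketch under invented names; `Impl`, the clause labels and the limits are all made up here, and a real suite would of course drive a live implementation:

```python
# Table-driven conformance sketch: each (clause, check) pair encodes one
# testable requirement; an implementation is conformant only if the list
# of failures comes back empty. Everything here is invented for show.
class Impl:
    """Stand-in for an implementation under test."""
    def max_texture_size(self):
        return 4096
    def error_after_bad_enum(self):
        return "INVALID_ENUM"

CHECKS = [
    ("min max-texture-size", lambda i: i.max_texture_size() >= 1024),
    ("bad enum sets INVALID_ENUM",
     lambda i: i.error_after_bad_enum() == "INVALID_ENUM"),
]

def failures(impl):
    """Names of every clause the implementation violates."""
    return [name for name, check in CHECKS if not check(impl)]

print(failures(Impl()))  # -> [] means it may call itself conformant
```

The point is the shape, not the checks: every "must" in the spec becomes one executable row, and the suite is the arbiter instead of each vendor's reading of the prose.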

NeARAZ
05-28-2008, 09:47 PM
-1 on OpenGL drivers are a joke.

Usage patterns of Quake Wars are currently on par with 90% of the most recent D3D 9 A titles.
Well, that was the point: unless you're using an id tech derivative, the drivers won't care about you. Yes, the drivers cope quite well with big AAA titles that use OpenGL (which are pretty much only id tech derivatives).

There is another market though - much smaller games that are targeted at a more casual audience. Not everyone makes AAA games, you know. And that target has several problems: 1) it's dominated by low-cost cards, mostly Intel. To this day Intel on Windows does not have GLSL, FBO and so on. 2) the players generally never update their drivers. Yes, that means there is no good way to "push" a new OpenGL version out to them, or even to push new drivers with fixes.

Now, in the non-AAA game market, where IHVs don't specially tailor drivers to your game, D3D is much more stable. I can't stress enough how much more stable it is. Back when we used OpenGL in Unity, probably 95% of crashes were coming from inside GL drivers. Now that we have a D3D9 backend, driver or D3D runtime crashes are quite rare; we more often have PhysX or Mono crashes. And that's just crashes; the other bugs are plain rendering bugs. So far we have 50+ workarounds for broken drivers on OpenGL, and something like 5 workarounds for broken drivers on D3D. Does that say anything?

Korval
05-29-2008, 12:23 AM
Get a programming staff then. Assemble a team from IHVs, make it open source, leave it up to the community, branch from Mesa - whatever, the point is that most of hard parts should be written once.

These things don't materialize out of thin air. Someone has to write it. Someone has to maintain it. Those people need to get paid. And where is the money coming from?

IHVs already need to write the simplified drivers for your proposed GL 3. Why would they funnel money into the front-end for this?

NeARAZ
05-29-2008, 12:59 AM
These things don't materialize out of thin air. Someone has to write it. Someone has to maintain it. Those people need to get paid. And where is the money coming from?
That I don't know :) Khronos somehow gets the money for its operations, right? (I assume some member fees or whatever.) Yes, paying a programming team is a new expense, but having a committee arguing over the spec is an expense as well.

My point is that there's no point in having a perfect API that does not work in practice. So one might as well cut down on arguing and spend the money on implementing a common runtime. Or get "the community" involved, somehow (where does Mesa get money from?).


IHVs already need to write the simplified drivers for your proposed GL 3. Why would they funnel money into the front-end for this?
Because right now everyone has to write the whole runtime. That means Apple, NVIDIA, AMD, Mesa, 3DLabs, possibly Intel someday - they all need to write the whole thing separately. By splitting the whole stack into "hardware/platform independent" (GLSL front end, possibly format conversions, ...) and "hardware dependent" parts, at least the independent part is shared.

Yes, that would need more coordination and management, but less total work. Instead of everyone writing a compiler, just put all those people on one compiler. Fewer bugs, and much more predictable behaviour. I know it's easier said than done, but when you have to choose between a thing that does not work and one that requires some effort but possibly does work... well.

...oh, and then while they're at it, make GLSL have a proper front-end that emits the syntax tree in binary form, with dead code removed and basic optimizations done... hey, this way the GL runtime could even be without a compiler - just take in binary shaders. And whoever wants can write their own offline compiler, with predictable behaviour and bugs that they can fix themselves. But that's a story for another day :)
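The binary-tree idea amounts to this round trip. Illustrative only: a real design would define a stable binary format rather than Python's pickle, and `offline_compile`/`runtime_load` are invented names:

```python
import pickle

def offline_compile(ast):
    """Pretend front-end: cleanup passes would run here, then serialize."""
    return pickle.dumps(ast)

def runtime_load(blob):
    """Driver side: no preprocessor, no parser -- just deserialize."""
    return pickle.loads(blob)

tree = ("mul", ("var", "color"), ("const", 2.0))
blob = offline_compile(tree)
assert runtime_load(blob) == tree  # the runtime sees exactly what was compiled
```

The appeal is that parsing bugs become the tool author's problem, fixable offline, while the driver only ever consumes an already-validated tree.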

Brolingstanz
05-29-2008, 02:10 AM
Because right now everyone has to write whole runtime.

I'm inclined to agree, but let's not forget that sword cuts both ways; individual implementation allows each vendor a certain amount of competitive and creative freedom. I think as long as GL is greatly simplified we'll see a large reduction in driver bugs, without having to re-orchestrate everything from the ground up.

Don't get me wrong, I'd like to see binary blobs or the like, but I'm not so fixed on that idea that I'm unwilling to consider some alternatives. More than anything I'd like to see a base vertex offset for the draw commands... anything to put a smile back on knackered's face... ;-)

NeARAZ
05-29-2008, 02:49 AM
I'm inclined to agree, but let's not forget that sword cuts both ways; individual implementation allows each vendor a certain amount of competitive and creative freedom. I think as long as GL is greatly simplified we'll see a large reduction in driver bugs, without having to re-orchestrate everything from the ground up.
Sure. Most of the runtime in GL3 would be much simpler... except for the GLSL parser :) And I still believe that a lexer, parser, dead code eliminator, nop eliminator and basic common code/subexpression optimizer are fairly platform and hardware independent things. Sure, leave the real optimizer and the translator to hardware microcode up to the IHVs; they can have their fancy competitive advantages there, if any :) But please no "creative freedom" as we have now, where in GLSL you can never be sure which subset or superset of the spec is actually implemented.

I guess I should just shut up now :)

ZbuffeR
05-29-2008, 04:38 AM
3Dlabs did release a compiler front-end for GLSL; OK, it was a long time ago:
http://3dshaders.com/home/index.php?option=com_weblinks&catid=13&Itemid=33

I guess none of the other vendors used it...

speedy
05-29-2008, 06:37 AM
Usage patterns of Quake Wars are currently on par with 90% of the most recent D3D 9 A titles.

I'm not sure what you mean by that.

I'm saying that the ATI & Nvidia basic building blocks are stable enough for open product deployment. I managed to target the range between GeForce 3 and 8800 and was quite satisfied with the stability. ~80% (!!) of the relatively small number of bugs were resolved by users installing the latest drivers, which was written as a requirement on the box.

Intel was _not_ among the supported cards.

P.S. I had more problems with Vista/ATI/laptop combos (users had to mod the drivers ;)) and with OpenGL being unable to initialize (drivers messed up, reinstall required) than with crashes and/or rendering bugs.

Lord crc
05-29-2008, 06:38 AM
I guess none of the other vendors used it...

Actually, I think ATI uses/used it. At least the ATI driver complains in the exact same way on the exact same code as the 3Dlabs one (or so my memory tells me).

Lord crc
05-29-2008, 06:43 AM
~80% (!!) of the relatively small amount of bugs were resolved by users installing latest drivers, which was written as a requirement on the box.

The drivers for my ATI X700 Mobile haven't been updated since I bought the laptop over 2 years ago. I haven't found any bugs in them so far, but I think that's more due to the fact that they don't expose anything fancy in GL at all... And updating is not an option, since ATI won't provide drivers directly for laptops... Yay!

Jan
05-29-2008, 09:14 AM
Lord crc: Take a look at DhMod Tool, it allows you to easily install any ATI desktop driver on your laptop. I used it for several years, to be able to use up to date drivers.

Jan.

santyhamer
05-29-2008, 10:36 AM
Hmmmm......
whois http://opengl3.org

Domain Name:OPENGL3.ORG
Created On:30-Oct-2007 01:00:20 UTC
Last Updated On:29-Apr-2008 00:26:48 UTC
Expiration Date:30-Oct-2010 01:00:20 UTC
Registrant Organization:Khronos Group

I really hope we could get an OGL3 update at Siggraph 2008, yep!

Eosie
05-29-2008, 12:05 PM
reasonable FBO extensions like the depth+stencil buffers or the multisample extensions, which ATi still doesn't support no matter how long they have been out.
Now, ATi DOES support them, check it out...

For you guys who have been complaining about Mobility Radeon drivers - here you have a direct link to them, they are updated every month: http://ati.amd.com/online/mobilecatalyst/ (works for me)

dor00
05-29-2008, 12:27 PM
Hmmmm......
whois http://opengl3.org

Domain Name:OPENGL3.ORG
Created On:30-Oct-2007 01:00:20 UTC
Last Updated On:29-Apr-2008 00:26:48 UTC
Expiration Date:30-Oct-2010 01:00:20 UTC
Registrant Organization:Khronos Group

I really hope we could get an OGL3 update at Siggraph 2008, yep!

That is becoming interesting.

Korval
05-29-2008, 01:18 PM
That I don't know :) Khronos somehow gets the money for its operations, right? (I assume some member fees or whatever.) Yes, paying a programming team is a new expense, but having a committee arguing over the spec is an expense as well.

Not really. The ARB is made up of volunteers. They donate a portion of their time to the ARB. The expenses for meetings are fairly negligible, particularly compared to hiring staff, providing IT support, etc.

Khronos is an organization, not a company. The membership fees are not nearly enough to hire a real programming staff.


And I still believe that a lexer, parser, dead code eliminator, nop eliminator and basic common code/subexpression optimizer are fairly platform and hardware independent things.

Some of that, yes. But then they have to develop 2 language specifications. One for glslang, the other for the intermediate language that they use internally. And if you think it was hard to make glslang, imagine how hard it will be for a lower-level language that has to somehow support the above optimizations while still providing enough high-level syntax to allow for greater optimizations.


Actually I think ATI uses/used it? At least the ATI driver complains in the exact same way on the exact same code as the 3dlabs one

Yes, ATi did use it. However, nVidia decided to tweak their Cg compiler, because they didn't like glslang and wanted to force Cg on us.

Lord crc
05-29-2008, 02:20 PM
Take a look at DhMod Tool, it allows you to easily install any ATI desktop driver on your laptop. I used it for several years, to be able to use up to date drivers.

Thanks for the tip, I'll check it out. However, even if it does work, I do NOT consider this a viable solution for regular users.


For you guys who have been complaining about Mobility Radeon drivers - here you have a direct link to them, they are updated every month: http://ati.amd.com/online/mobilecatalyst/ (works for me)

First of all, if I go the official route, I end up with a program that says my laptop is not supported by their driver. The installer for the driver you posted simply quits and does not install itself.

Now, I guess I do understand the reason why they've done this. The annoying part is rather that their old drivers have such poor OpenGL support.


But then they have to develop 2 language specifications. One for glslang, the other for the intermediate language that they use internally.

Well, if they only developed the intermediate language, then others could write the front-end... However, since I don't know all the possible optimizations that can be done on GLSL code, I can't say how viable this option is.

NeARAZ
05-29-2008, 11:23 PM
Some of that, yes. But then they have to develop 2 language specifications. One for glslang, the other for the intermediate language that they use internally. And if you think it was hard to make glslang, imagine how hard it will be for a lower-level language that has to somehow support the above optimizations while still providing enough high-level syntax to allow for greater optimizations.
No one is saying it's going to be easy. But if the competition (in this case, Microsoft) can develop a spec (yes, D3D does have a spec), a working compiler, a binary format and all the related tools, then it comes down to the basic laws of economics: if a competing API is better, people tend to use it (where possible). Competition should nudge OpenGL into becoming better as well.

But like I said, in my book all of this does not really matter. Just use whatever API is best on the platform/hardware in question. So far on Windows D3D9 (or D3D10 if you're fancy) is clearly a better choice. On Mac, Linux etc. OpenGL is clearly the best (and at the same time the worst :)) choice. On consoles the proprietary APIs are the choice. On mobile platforms it's something like GL ES.

No big deal. I am just trying to think up the features that OpenGL should have if it ever wants to be a viable choice on some platforms (Windows in this case). If it never gets them - fine, I will happily continue using D3D. If OpenGL ever becomes better than D3D - fine, I will use OpenGL. APIs are not religious things, they're just tools to get the job done.

Again, all this is highly subjective.

dor00
05-30-2008, 12:33 AM
However, nVidia decided to tweak their Cg compiler, because they didn't like glslang and wanted to force Cg on us.

Well, Cg isn't bad at all. And I don't think they force us. Cg is a great thing IMHO, and the fact that it is made by nVidia is also good; they know how to make it better than everybody else, since they make the cards. And they cover all profiles.

ScottManDeath
05-30-2008, 12:58 AM
I think the nicest thing about Cg is that it comes with an effect system. That way, it is much easier to organize shaders beyond the simple separate vertex/geometry/fragment shader files GLSL pretty much enforces. Having a single file containing shared uniforms and shared functions is very useful.

knackered
05-30-2008, 06:52 AM
conversely, I think that's the most useless part of Cg.

-NiCo-
05-30-2008, 06:58 AM
I have to agree with knackered on this one. It's better to have them in separate files so that you can mix the files depending on what you need. On the other hand, one of the great things about Cg is the ability to set states within the effect file, such as sampler filtering/wrapping modes amongst many others.

knackered
05-30-2008, 08:11 AM
no, I meant the whole fx file system.

bobvodka
05-30-2008, 08:58 AM
I've found the fx system useful in D3D when knocking stuff together and testing things; beyond that, though, I feel you'll really want to control that stuff yourself.

So, like many things (that D3D has and GL doesn't), it's useful for getting going and testing things out.

Korval
05-30-2008, 10:29 AM
Well, CG isnt bad at all.

Whether CG is good or not is entirely irrelevant. The OpenGL ARB agreed on glslang. Good or bad, supporting the prevailing standard is more important than torpedoing that standard in favor of your idea of what is better.

Being a gracious loser is more important than winning.

bobvodka
05-30-2008, 11:05 AM
Well, winning is more important, but if you've lost you should just take it like a man and not whine about it...

PkK
05-30-2008, 11:07 AM
If OpenGL 3 had focused on just two things: 1) a common runtime everywhere (the GLSL parser and basic optimizer is probably the most complex piece) and 2) getting rid of all the complexity in favour of a simple API, maybe the drivers could have improved. Coupled with the fact that DX10 is tied to Vista, it might have had a chance... But then GL3 should have been out two years ago, because right now it's already too late.


C is today's portable assembler. I think it was the right choice to have GLSL at a similar abstraction level.
Drivers would treat that common runtime just as they treat GLSL input today: spend time trying to optimize it in complex ways before sending it to the card.

Philipp

Leadwerks
05-30-2008, 11:44 AM
A visual representation of Khronos' attitude towards graphics developers:
http://static.flickr.com/53/130245294_23b05ea699_o.jpg

Korval
05-30-2008, 12:21 PM
A visual representation of Khronos' attitude towards graphics developers:

No, just non-OpenGL ES developers.

Jan
05-30-2008, 12:35 PM
And that's how this board hit a new low...

Honestly people, we are all frustrated, but could we still act like grown-ups here?!

knackered
05-30-2008, 03:45 PM
depends what you mean by acting like a grown-up. If it entails invading iran, then I'll stay being childish.
I think the image was totally justified and restrained.

pudman
05-30-2008, 06:26 PM
This thread has at least gotten a lot more interesting since Korval decided to post again.

Khronos could do to hear more rational negativity. The smartest thing they could do is respond to these concerns and frustrations. Yes, even above delivering the best GL3 ever. It's simple trust/respect.

Re Knackered:
Sure, anyone can invade Iran, but only the one who can justify himself will have support.

Korval
05-30-2008, 11:18 PM
I think the image was totally justified and restrained.

I wouldn't go that far. The image suggests malice when stupidity seems more appropriate. Don't forget Hanlon's Razor (http://en.wikipedia.org/wiki/Hanlon%27s_Razor).

More than anything, the image speaks very strongly to the OpenGL community's absolute and total frustration with the ARB's seeming inability to get anything done. Whether it's my ideas, Knackered's, NeARAZ's, or whomever's, they seem pathologically incapable of executing anything.

dor00
05-31-2008, 12:00 AM
I think the image was totally justified and restrained.

I wouldn't go that far. The image suggests malice when stupidity seems more appropriate. Don't forget Hanlon's Razor (http://en.wikipedia.org/wiki/Hanlon%27s_Razor).

More than anything, the image speaks very strongly to the OpenGL community's absolute and total frustration with the ARB's seeming inability to get anything done. Whether it's my ideas, Knackered's, NeARAZ's, or whomever's, they seem pathologically incapable of executing anything.

I really want to hope that you are wrong Korval... but if you are right, then we need a real change.

One thing is sure: OpenGL, as a cross-platform graphics API, should not die, whatever happens, and it must be updated regularly by a competent company/group/whatever.

Leadwerks
05-31-2008, 12:47 AM
It doesn't matter what the reason behind the scenes is. The net effect is of the image I posted. Whether due to incompetence or maliciousness, it matters not to me.

Neglect is passive malice.

despoke
05-31-2008, 02:49 AM
I've been following this thread for a long time now...

It's a wild idea and I have nothing to back it up, so it's my personal assumption only... But could there be some IP issues with the next GL3 spec? That might explain why the Khronos Group has remained silent for so long (no update for a year): perhaps they were told to keep things quiet until it gets legally resolved with whichever company holds the patent claim, etc...

MZ
05-31-2008, 04:46 AM
And that's how this board hit a new low...

Honestly people, we are all frustrated, but could we still act like grown-ups here?!

Yes we could, but what for?

I'm sure we will act as grown-ups as hell, once we get the spec. At Siggraph 2008...

Till then, the kindergarten level is acceptable (or even appropriate), I'd say.

Jan
05-31-2008, 04:47 AM
IP issues? Now THAT'S a brand new idea for explaining the delay! How come no one else would have thought of this!?

"I've been following this thread for a long time now..."

Really?!

knackered: I value your professional opinion about 3D graphics stuff, but i really don't care for your political views, please spare me those in the future.

Jan.

V-man
05-31-2008, 05:32 AM
It doesn't matter what the reason behind the scenes is. The net effect is of the image I posted. Whether due to incompetence or maliciousness, it matters not to me.

Neglect is passive malice.

It matters.
I would like to know what is the problem or problems.

More importantly, is GL3 going to come? A simple yes or no would suffice.

LogicalError
05-31-2008, 07:49 AM
But could there be some IP issues in the next GL3 spec? Which may explain why the chronos group has remained silent for so long (no update for a year), because they were told to keep things quite until it gets legally solved with whichever company holds this patent claim, etc...

maybe microsoft? ..not that i want to sound like a paranoid anti-microsofty or something ;)

pudman
05-31-2008, 09:04 AM
It's NOT Microsoft. It's annoying to keep hearing that one come up.

Anyway, if it were an IP issue that still wouldn't justify silence on the matter. They could have said "It's an IP issue that we need to resolve before more public information is released."

In my mind the only rationale for silence would be that somehow the new spec reveals some future capability that nvidia/amd don't want revealed yet. But even that is stupid because GL3 was supposed to simply replace GL2 as an API supporting current generation hardware (or in theory the same hardware as GL2).

Whatever we can think up, it just doesn't make sense to keep silent about it. It's stupid.

knackered
05-31-2008, 09:08 AM
knackered: I value your professional opinion about 3D graphics stuff, but i really don't care for your political views, please spare me those in the future.
If this were an email conversation, I'd respect that wish. But seeing as though it's a public forum, I'll ignore that wish.

I like a good bicker. I'm not happy unless I've got something to moan about.

NeARAZ
05-31-2008, 09:12 AM
One thing is sure, OpenGL as cross platform graphics api should not die, whatever it happens and must be updated and regularly by a competent company/group/whatever.
Everything happens because there is a need. If a supposedly cross-platform API dies, then oh well, apparently there wasn't enough need to have it.

One way or another, there will be a way to access the graphics hardware. Whether it is called OpenGL or anything else, and whether there is a single/similar way to access hardware on different OS/platforms - not so much of an issue (at least for me).

bobvodka
05-31-2008, 09:26 AM
It's NOT Microsoft. It's annoying to keep hearing that one come up.


You say that, but it could be.
Consider that the OpenGL 3 API is very close to the D3D10 API in its method of working; what if MS had a patent related to that and decided to send in the lawyers?

Is it likely? Probably not... but in the absence of anything concrete, wild speculation is the order of the day :D

Lord crc
05-31-2008, 10:24 AM
Imo, there's no technical (or similar) reason which can explain why they haven't said anything at all about what's holding them back. At least they could have admitted that they've tossed any plans of an immediate release (back in november or whatever).

Leadwerks
05-31-2008, 12:46 PM
It is true that a great many powerful companies are heavily invested in console technology right now, and have a motivation to not want PC graphics to advance. Look at the shitty engine Epic sells compared to the superior Cry Engine 2. Obviously Epic wants to keep the consumers content with old lightmapped technology, and does not want people to expect graphics the console systems cannot run. Khronos' money comes from Sony, and Sony obviously has no interest in PC graphics being better.

It's not a matter of speculating who is conspiring against us, it is more just a matter of looking at where the money comes from and who has motivation to retard PC graphics. I don't think OpenGL 3 is going to run on the Playstation, so why would Sony want PCs to have a better graphics API?

So when I first heard that the Collada people were going to be running OpenGL, I was not happy. I also thought it was laughable that MS and Epic are members of the "PC Gaming Alliance". These are two companies that benefit from PC gaming being f*cked up!

It's obvious it's being held up by IP issues, but they could at least say "It's being held up by legal problems" rather than just give us the finger.

V-man
05-31-2008, 02:56 PM
It is true that a great many powerful companies are heavily invested in console technology right now, and have a motivation to not want PC graphics to advance. Look at the shitty engine Epic sells compared to the superior Cry Engine 2. Obviously Epic wants to keep the consumers content with old lightmapped technology, and does not want people to expect graphics the console systems cannot run. Khronos' money comes from Sony, and Sony obviously has no interest in PC graphics being better.

It's not a matter of speculating who is conspiring against us, it is more just a matter of looking at where the money comes from and who has motivation to retard PC graphics. I don't think OpenGL 3 is going to run on the Playstation, so why would Sony want PCs to have a better graphics API?

So when I first heard that the Collada people were going to be running OpenGL, I was not happy. I also thought it was laughable that MS and Epic are members of the "PC Gaming Alliance". These are two companies that benefit from PC gaming being f*cked up!

It's obvious it's being held up by IP issues, but they could at least say "It's being held up by legal problems" rather than just give us the finger.

GL3 is an API cleanup plus some small amount of performance and consistency. It won't make PC gaming better.
Also, GL3 could be made available for the PlayStation 3, and more quickly, since the driver design would be MUCH simpler.

Jan
05-31-2008, 03:31 PM
I don't know about the PS3's 3D graphics API, but i would assume, that Sony might at least consider to implement GL3 on it, if it matches its hardware (ie. has no special features, that are not handled through GL3).

The point is, if the PS3 did allow to use GL3, it would attract more developers to port to the PS3 (or, it would make it EASIER), since people would then only need to support 2 APIs (D3D9 and GL3), whereas today they need 3, if they want to include MacOS (and Linux).

On the other hand, what i heard about Sony, they are still more of a hardware-company and their tools are supposed to be quite limited (more information from people who know what they are talking about is much appreciated!), so maybe they just don't care much and think it is the developers problem, not theirs.

Anyway, having more options is usually a good thing.

Jan.

Zengar
05-31-2008, 03:55 PM
I don't know if this thought was already expressed by anyone, but is there really a point in a new graphics API? Look at the modern cards --- they are rapidly evolving into general-purpose stream computing devices. If this continues, in 3-5 years we will need a cross-platform stream computing API (like CUDA), which will make any graphics API obsolete. That is, even if they bring GL3 out, it won't last for longer than maybe six years.

Jan
05-31-2008, 04:32 PM
Yes, actually that thought was already expressed! I think it was somewhere in this thread, or in the "No newsletter out yet"-thread.

Korval
05-31-2008, 05:08 PM
is there really a point in a new graphics API?

Yes. Because no matter how general-purpose a GPU becomes, it's still a GPU. And the most effective way to set that GPU up is not something that a graphics programmer should have to know.

GP-GPU APIs and the like are not good graphics APIs.

Seth Hoffert
05-31-2008, 05:16 PM
I have to agree with Korval here... and plus, there are still things on a graphics card that simply aren't generalized (such as texture-related hardware).

Leadwerks
05-31-2008, 05:22 PM
GL3 is a API cleanup and some small amount of performance and consistency. It won't make PC gaming better.
One of PC gaming's biggest problems is bad drivers. Do you think it is in Microsoft or Epic's best interests that PCs have reliable drivers, when they are so heavily invested in obsolete console technology? I am not saying they are responsible, but I am saying there are large companies that should see OpenGL3 as a minor threat, or at least not something that they are going to benefit from. And I would not put it past any of those companies to drag their feet on coming to an agreement about some bit of IP they hold.

I think that is the best possible guess at what has happened. We all know it is an IP issue, or they would have provided some explanation. And there are motivations for the companies involved not to fully cooperate with Khronos. I don't know if this is coming from Microsoft, Sony themselves, or someone else, but it looks to me like someone has sicced their lawyers on Khronos at the last possible moment.

bobvodka
05-31-2008, 05:23 PM
On the other hand, what i heard about Sony, they are still more of a hardware-company and their tools are supposed to be quite limited (more information from people who know what they are talking about is much appreciated!), so maybe they just don't care much and think it is the developers problem, not theirs.

I work for a company which does games for various platforms; right now I'm working on an EyeToy game for the PS2. The tools we have for that are not what you'd call top of the line.

The compiler is GCC, which is probably the best bit about it; the IDE is CodeWarrior, known throughout the land by people who have to deal with it as 'the biggest pile of crap ever to be called an IDE' (see below for an example).

I understand things aren't much better in PS3 land either; I dare say if I move to a PS3 project when this one is done I'll find out.

On the plus side you do get very detailed docs from them, both in paper and PDF format, which cover basically everything about how the chips work. Although I'd prefer better tools :(

---
So, it's Thursday evening, about 5pm, and I wander over to a co-worker's desk to kill some time, as I was feeling lazy that day. I find him looking puzzled at some spotlight code in the engine which didn't seem to work. The main issue he had was that, when tracing into a function with an asm block in it, the correct result never seemed to be returned. I was surprised by this as it was the dotproduct() function, one which is used all over the place; you'd have thought it would have been noticed before now.

So, he runs the program, we trace into it, and lo and behold the debugger says that our return variable is never written to, and as such the returned value is rubbish. As it was going-home time, we deferred it to our lead to sort out the next day.

Next day I wander in at 10am (yay flexible hours) and at about 10:30am notice our lead was in but hadn't been to his desk. So I walk to my co-worker's desk from last night (which is in a side room) to find him and my lead, PS2 manuals out, trying to find out wtf is going on with this function.

Long story short, I cracked it in the end, as I had slightly more knowledge of assembly than the others looking at it (at one point there were 4 programmers staring at this problem); once I looked back at the call site I realised what was happening: the value was being left in a register by the function call and then used directly afterwards, which is why the memory location was never updated.

This was confusing because, looking at the generated assembly, the compiler reserves space on the stack for the variable to be returned, sets it to zero as asked, and then at the end cleans it up without ever using it.

In release mode, fine, but this was debug mode. Sometimes the debugger can track things in registers, but this time it wasn't playing ball either.

It was around midday when I finally cracked this, after learning how PS2 assembly works, learning the calling convention, and refreshing my mind about the gcc asm command; that's 1h 30min of my time and around 2h of my lead's and co-worker's time gone, because either the debugger didn't do things right or the compiler was being too smart.

Zengar
05-31-2008, 05:24 PM
is there really a point in a new graphics API?

Yes. Because no matter how general-purpose a GPU becomes, it's still a GPU. And the most effective way to set that GPU up is not something that a graphics programmer should have to know.


I disagree. A GPU that just contains multiple processing units with no specialized hardware for graphics-related tasks (excluding maybe dedicated texturing hardware) is not a GPU in the common sense anymore, and I would expect such cards to appear in the next generations (ATI cards do not even have dedicated multisampling hardware anymore, and if I am not mistaken the G80 does rasterization via its shader units). Of course it puts more strain on the graphics programmer to do "everything by himself", but I expect open-source frameworks that take over the most common tasks (like rasterization) to emerge soon after stream processing hardware becomes common (much like programming libraries today).
Also, it gives you the power to use the hardware in the way most beneficial for your application, which will soon result in new interesting techniques and methods.



GP-GPU APIs and the like are not good graphics APIs.

True, but it boils down to "what is actually there". This way, a graphics API would be just a layer on the top of the stream processing API.

bobvodka
05-31-2008, 05:34 PM
Right, so you accept there has to be a level above a stream processing API for graphics to work, so what better group to write that level than the people who know how the gfx card works?

And at that point why bother putting it above it? Why not just have the two APIs to avoid the call overhead?

And tada! OpenGL or D3D is still needed.

Personally, I'd much rather the guys who know about the hardware dealt with all that stuff and let me get on with telling it where I want my stuff drawn. I don't want to have to care about these things. It's bad enough as it is; now imagine having to trawl through code to set everything up just how you need it.

The whole 'oh, someone will write a framework for it' might be great for people just playing about, but once you get out of the GLUT-like sandbox world and need more control you need to start digging.

And no two pieces of hardware will be made equal. The way to treat AMD will differ from NV, which will differ from Intel, and it's bad enough right now; imagine having to effectively do what drivers do now for each bit of hardware when new stuff comes out?

*shudders*
No, a GP-GPU API is all well and good for GP-GPU apps, but for a specific problem domain, such as 3D graphics, a dedicated API and driver is to be preferred.

Lord crc
05-31-2008, 09:37 PM
Yeah, you can code for the x86 CPU using ASM, or you can just use one of the highly optimizing C++ compilers... I'm guessing the same will be true for whatever comes next on the graphics front.

Brolingstanz
05-31-2008, 09:47 PM
OpenGL or D3D is still needed.

Though I think the difference is that those future APIs could be written by anyone; sitting on top of some lower level plumbing they'd be more like GLU or D3DX, requiring little or no driver internals and such... just a specialized convenience layer atop a much more general infrastructure.

Korval
06-01-2008, 01:23 AM
Do you think it is in Microsoft or Epic's best interests that PCs have reliable drivers, when they are so heavily invested in obsolete console technology?

For Epic? Absolutely.

Shocking though it may be to believe, console games are developed on PCs. They use 3D tools like Max, Maya, or XSI. Some even have an in-house PC port that they use for early development (so as not to require the more optimized asset conversion process for console builds). Those tools need reliable drivers in order to function. Artists need to be able to develop shaders and see them work in those tools, so that they don't have to go through a lengthy export and asset conversion process just to see if their mesh works correctly.

For Microsoft? Even moreso.

Vista (and every post-Vista OS) is built on video drivers. Every graphical element, whether AeroGlass or not, goes through Direct3D and 3D graphics card drivers. If those don't work, the OS fails. Period.

And that's not something Microsoft wants.

Do Epic and Microsoft need OpenGL drivers in particular? No. But Epic, and any other non-Microsoft PC developer, would be entirely neutral on OpenGL 3.0.

Lastly, an aside: console technology is not "obsolete". If you're a PC gamer filled with bitterness at seeing so many companies abandon your platform for consoles, you have my pity. But keep the bile off this forum.


And I would not put it past any of those companies to drag their feet on coming to an agreement about some bit of IP they hold.

And yet, none of these companies are in a position to hold up GL 3.0's development. Microsoft long since quit the ARB. And Epic has never had any pull with them.


I'm working on an eyetoy game for the PS2.

You have my sympathy.


the IDE is Codewarrior, known through the land by people who have to deal with it as 'the biggest pile of crap every to be called an IDE' (see below for an example).

That's strange; PS2 doesn't have an IDE. If you're using Codewarrior, that's because that is what your company provided you with. I used Visual Studio (through a makefile-based project) just fine with PS2. Debugging was handled via an external application that was certainly not Codewarrior.


A GPU that just includes multiple processing units with no specialized hardware for graphics-related tasks (excluding maybe a dedicated texturing hardware) is not a GPU in the common sense anymore

I would also add that a GPU that doesn't include any specialized hardware for graphics-related tasks is not a good GPU. So it's fairly irrelevant to talk about such things. It's more like something pretending to be a GPU.


Though I think the difference is that those future APIs could be written by anyone; sitting on top of some lower level plumbing they'd be more like GLU or D3DX, requiring little or no driver internals and such...

But because low-level interfaces are so, well, low-level, you will need intimate knowledge of how the hardware works to write a higher-level interface that properly abstracts the hardware without losing performance. Hence: OpenGL. "Anyone" will not be writing such implementations; they should be written by people with actual deep hardware knowledge.

Chris Lux
06-01-2008, 02:05 AM
I don't know if this thought was already expressed by anyone, but is there really a point in a new graphics API?
if you take a close look at CUDA, there is not much that sets it apart from today's shaders (my personal opinion). extend the shaders to access shared resources (shared on-chip memory) and scatter writes and you have all you need, plus all a graphics API gets you.

trying to implement ray tracing using CUDA is cool, but at some point you begin to rebuild stuff you already have with OpenGL (like normal bindable textures or texture arrays...). i think a small extension to the cross-platform shader APIs brings much of the CUDA capability. (because _we_ want to do graphics with it)

Jan
06-01-2008, 02:23 AM
I agree with bobvodka and Korval:

Even if it were possible to write a graphics API on top of some stream API, such that non IHVs could make it, where's the point?!

First of all, such a group would need to sell its API to make a living, instead of giving it away for free, as it is now. That is because, if it were an open-source project (i.e. a hobby project), it would be too unreliable. You know, some GPUs get well supported because the project lead has that GPU in his computer; sometimes there are many updates, sometimes there are none for a long time...

It wouldn't work well, unless it is a dedicated company.

On the other hand, no matter how general-purpose GPUs become, the fact is that they will always be mainly used for graphics, because people buy them as graphics cards! So even if GPUs compete as GPGPUs in the future, they will still also compete as plain graphics cards. And that is why IHVs will remain interested in having good graphics drivers. Most software developers are not able to write a rasterization renderer today. I couldn't. And that is A GOOD THING. Not the missing knowledge itself, but the fact that we don't need it.

My biggest problem with the discussion about "not needing graphics APIs in the future anymore" is that it neglects the development of the software industry over the last 15 years (at least). Software is becoming so complex and so huge that it is no longer possible to write the whole code of a game in a few years. It's not like it was with Quake. All these great games today are only possible because of heap loads of middleware, INCLUDING a graphics API. It is complicated enough writing a great renderer; no one wants to be forced to also write the low-level stuff. Especially not for different types of GPUs.

Of course I agree that it is nice to be ABLE to write your own graphics stuff using a stream API. Just as it is nice to use assembler to interface more directly with your CPU. But just being able to do it should not mean that you are forced to do it (or to rely on some other non-IHV company to do it for you).

Jan.

Brolingstanz
06-01-2008, 02:25 AM
"Anyone" will not be writing such implementations; they should be written by people with actual deep hardware knowledge.

Deep hardware knowledge and, dare I say, a flair for the extraordinary ;-)

Brolingstanz
06-01-2008, 02:41 AM
First of all, such a group would need to sell it's API, to make a living, instead of giving it away for free, as it is now.

True enough; there is the whole economic facet at work here. I don't know all the angles myself, but you can bet your bottom dollar that no company or organization is furthering OpenGL out of the goodness of their hearts or as part of some idealistic crusade for cosmic justice. That's fine, of course, but a point probably worth bearing in mind.

Seth Hoffert
06-01-2008, 08:20 AM
if you take a close look at cude, there is not much that sets it appart from todays shaders (my personal opinion). extend the shaders to access shared resources (shared on chip memory) and scatter writes and you have all you need plus all as graphics api gets you.
I agree. It would be nice to see shaders be able to do the same things as CUDA so we wouldn't have to mix two different things which both use the GPU (it feels so impure). :) Transform feedback is a nice solid step in that direction. But, it would be nice to be able to write a convolution in a shader and have it be just as fast as the CUDA version (by using shared memory explicitly). After all, shaders should be good at that kind of thing right? :D

knackered
06-01-2008, 10:40 AM
My word, this discussion is now referenced in the OpenGL Wikipedia article.
http://en.wikipedia.org/wiki/OpenGL#cite_note-14

Zengar
06-01-2008, 11:12 AM
Wow, I feel like a part of history :D

knackered
06-01-2008, 01:59 PM
I do hope that me putting a link back to the article that in turn links to this discussion doesn't cause a negative reality inversion, killing us all in a blaze of logic.

bobvodka
06-01-2008, 02:21 PM
Won't be the first time it's happened...

Leadwerks
06-01-2008, 03:07 PM
Lastly, an aside: console technology is not "obsolete". If you're a PC gamer filled with bitterness at seeing so many companies abandon your platform for consoles, you have my pity. But keep the bile off this forum.
I don't really play games. But I know that Crysis is the most graphically advanced game I have ever seen, by far. And according to Crytek the consoles don't have enough power to run the game at high settings. Consoles are already obsolete, and the gap between high-end PC performance and consoles will widen a lot more before the next generation of consoles appears.

The XBox 360 uses DirectX 9. The current version of DirectX is 10. The market is still catching up, and we won't see the effects of this for a while, but it is inarguable that console technology is behind the PC, and that discrepancy will become more significant with time.

bobvodka
06-01-2008, 03:33 PM
Which is a great rant and all, but you are missing a couple of key points:
1 - not everyone has a high-end PC
2 - consoles look 'good enough'; in fact Lost Odyssey looks downright amazing (and that is Unreal Engine 3 based)
3 - having a constant platform makes it easier to buy games

So, sure, consoles might not have the same graphics power as the latest NV or AMD cards, but then most people don't have those cards either.

Also, you don't have to deal with Windows, drivers, installing, setting up and all that. Buy game, put game in, play game.

That's why people have consoles: not to have the latest tech. And frankly, given the size of the console market, I very much doubt anyone with a vested interest is worried about the PC. Sales figures alone back this up.

Jan
06-01-2008, 04:05 PM
Crysis does indeed look pretty amazing. But it is a boring game.

In the end, all that matters is game-play. And consoles are capable of delivering what is needed for innovative game-play (e.g. Assassin's Creed has innovative game-play and it runs well on consoles). That has only been the case since the last generation of consoles; all earlier consoles were at some point the limiting factor, not only for graphics, but even for some new game-play ideas.

Consoles are limited compared to PCs, but it seems not to be such a big issue anymore. 99% of all games are now done for consoles and then sometimes ported back to the PC. It seems the extra power that a PC can give you is not needed to realize most modern games.

The current generation of consoles will last very long, even if it is already outdated. I assume that the next generation will actually just be a PC with fixed hard- and software (we're not fully there yet), and I assume it will make consoles even more popular, because I think games are converging on some high standard that will be difficult to top (content-wise), thus no longer requiring the most high-end hardware available.

Jan.

Korval
06-01-2008, 04:56 PM
it is inarguable that console technology is behind the PC, and that discrepancy will become more significant with time.

There's a difference between "less advanced" and "obsolete". The latter means that it is useless, while the former simply means that it isn't as powerful.

Leadwerks
06-01-2008, 05:11 PM
You can talk about the market, sales, etc., but the fact is that consoles are always going to be locked into outdated technology. The current state of affairs is an artificial one created by marketing, but it won't last. And this all relates to my point that many of the big players have a vested interest in retarding technology.

bobvodka
06-01-2008, 05:30 PM
Well, yes, they are all locked at the technology which came out when they were released and this is a GOOD thing.

The PC is a totally moving target; you are never sure of the specs, the OS, or the drivers at any given point. With a console, you buy it once and for the next X years (where X >= 5 at least) you can play any game released on that platform. ANY.

This makes life easier for everyone; at worst you now have to deal with HD vs non-HD on a given platform (something I've not had to worry about, being on a PS2 game).

After X years have passed you reinvest and you get another X years of game play.

What would you have them do? Release updated consoles all the time? That kind of goes against the idea and complicates things no end. I'm not sure how this is 'marketing' at all; the technology is fixed, games work, people want to play games.

I've got a decent enough and stable PC (graphics-wise a GF8800GT 512, plus a 2.2GHz X2 CPU and 4GB of RAM), but most of my gaming hours are spent on my 360. Why? Because I don't have to care about installing things; I put the disc in and I go.

Sure, I could play games which look a tiny bit better, but frankly I have trouble telling the difference, and if Crysis requires me to splash out another few hundred quid, I'll stick to playing my Unreal Engine 3 powered 360 games, thanks; they look great anyway, regardless of the tech behind them.

Korval
06-01-2008, 06:17 PM
And this all relates to my point that many of the big players have a vested interest in retarding technology.

That presupposes that the PC gaming scene matters in any way to console gaming; that PC gaming would be a threat to consoles if something weren't holding the PC back artificially.

Sorry, but that's total nonsense. PC gaming is too complicated for most people. It requires downloading patches, drivers, updates, having a computer with certain special hardware to begin with, etc. What most people want is a gaming appliance. You put the game in, you play. End of story.

Wii is currently outselling everything in the gaming space (except the DS). And guess what? It is using a graphics processor that was "obsolete" a good 7 years ago.

In short, the very presumption that graphics = sales is factually wrong. PC gaming is no threat to consoles, so long as it remains much more complicated to use as a gaming device. And better graphics won't fix that.

Leadwerks
06-01-2008, 07:06 PM
OpenGL 3 isn't about better graphics. It is about more reliable drivers, and thus simplifying PC gaming significantly.

Mars_999
06-01-2008, 08:31 PM
Which is a great rant and all but you are missing a couple of key points;
1 - not everyone has a high end PC
2 - consoles look 'good enough'; in fact Lost Odyssey looks downright amazing (and that is UT3 based)
3 - having a constant platform makes it easier to buy games

So, sure, consoles might not have the same graphics power as the latest NV or AMD cards, but then most people don't have those cards either.

Also you don't have to deal with windows, drivers, installing, setting up and all that. Buy game, put game in, play game.

That's why people have consoles: not to have the latest tech. And frankly, given the size of the console market, I very much doubt anyone with a vested interest is worried about the PC. Sales figures alone back this up.

What are you talking about, installing? Are you serious? I have a PS3 and I know the 360 is the same way: you will install the game onto the console just like on a PC, so that statement is FALSE! And you will put the disc into the console just like most PC games require, though unlike consoles, PCs sometimes allow you to bypass the DVD check, e.g. "Sins of a Solar Empire." Drivers? Don't get me started; there are new firmware updates all the time for the PS3, and I bet for the 360 also, and if not for the 360, the next one will have them, since it will be a PC in a console shell just like the PS3 is. And the market share of PCs with GF7-series or X800-series GPUs is in the millions. So what, ignore that market share? That sounds downright stupid from a business standpoint.

Leadwerks
06-01-2008, 08:52 PM
Mark my words, when the GeForce 8800 GTS falls below $100 you will start to see a shift the way I am describing. It won't be long before gamers just expect to see unified soft shadows on everything, and Crysis-like graphics will be considered standard.

pudman
06-01-2008, 10:15 PM
It won't be long before gamers just expect to see unified soft shadows on everything, and Crysis-like graphics will be considered standard.

I don't expect that at all.

On a console, I tend to have higher expectations for gameplay, not graphics. If I wanted fancy graphics I'd invest in my PC.

On a PC, I do enjoy bells and whistles, but Crysis-like graphics? Who would care in a game like StarCraft 2? The Sims? Crysis-like games do not dominate the market, by far. Nor will they when DX10 hardware gets cheap. *Fun* games do. Graphics do not make a game fun.

On GL3, were you arguing that it would help bridge the usability gap between PCs and consoles, given how magically easier it supposedly makes writing drivers? Don't forget how many hardware variations will have to be supported, regardless of the ease of driver development.

NeARAZ
06-01-2008, 11:09 PM
Mark my words, when the GeForce 8800 GTS falls below $100 you will start to see a shift the way I am describing. It won't be long before gamers just expect to see unified soft shadows on everything, and Crysis-like graphics will be considered standard.
In the AAA PC games arena, maybe. But then AAA games on PC are indeed a fading market. Why? Simply because consoles are easier and cheaper.

In the "casual" PC games space (which is huge), I doubt that the GF 8800GTS will be the norm any day soon. Sure, I'd love it to be the norm, but as of right now the kings there are Intel 945/915/965, GeForce 6050/6100, Radeon X300/Xpress 200, and S3 UniChrome. That's quite a leap backwards from an 8800GTS. Compared to those, consoles don't look that "obsolete", don't you think?

(and for the sake of facts, Xbox360 does not use DX9... sure, it's tech of a similar level, but it has some features that are roughly DX10 level or, dare I say, even beyond that)

Mars_999
06-01-2008, 11:34 PM
You are comparing apples to oranges here with casual PC gaming vs. consoles. If you are saying casual gamers use IGPs for GPUs, then they never bought their PC to play games with in the first place. But console gamers did buy their console to play games with. So if you are comparing what is the norm, then you need to look at what PC gaming rigs have; go look at the Steam statistics, they have a good distribution of what is being bought. The PS3 hardware shares a lot of inner workings with the NVIDIA 7800, and the Xbox 360 hardware is in many ways the precursor to the R600 desktop PC graphics card series, so they aren't DX10 or better: first off, neither even has a geometry shader stage in the pipeline, which would be the first requirement for DX10. Next you need 32-bit floating-point texture filtering for DX10.1, just to name a few.
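The feature checks Mars_999 lists can be probed from OpenGL by inspecting the extension string. Below is a minimal sketch of that idea; the extension names are real OpenGL extensions, but the helper function and the hard-coded "DX10-class" mapping are illustrative assumptions, not an official feature-level definition. In a live program the string would come from glGetString(GL_EXTENSIONS).

```python
# Sketch: guess whether a GPU is roughly "DX10 class" from its OpenGL
# extension string. The mapping below is an illustrative assumption,
# not an official definition of Direct3D feature levels.

# Geometry shaders and floating-point textures are the two features
# called out in the post above.
DX10_CLASS_HINTS = [
    "GL_EXT_geometry_shader4",   # geometry shader stage
    "GL_ARB_texture_float",      # floating-point texture formats
]

def looks_dx10_class(extension_string: str) -> bool:
    """Return True if every hint extension appears in the
    space-separated extension string (as glGetString returns it)."""
    available = set(extension_string.split())
    return all(ext in available for ext in DX10_CLASS_HINTS)
```

In a real application you would feed this from `glGetString(GL_EXTENSIONS)` (a pre-3.0 call, matching the era discussed here) rather than passing the string in by hand.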

Brolingstanz
06-02-2008, 12:31 AM
OpenGL 3 isn't about better graphics. It is about more reliable drivers, and thus simplifying PC gaming significantly.

Yes indeed. PC gaming, CAD/CAM, science and engineering, industrial/military simulation, and probably a few hundred other applications, just to name a few... a veritable smorgasbord of GL3 goodness right around the corner.

But I'm with you in that PC gaming is at the top of my list, for better or worse, until death do us part. Just can't beat the cheat potential ;-)

Korval
06-02-2008, 12:52 AM
It won't be long before gamers just expect to see unified soft shadows on everything, and Crysis-like graphics will be considered standard.

Doom 3 level graphics aren't even considered standard.

The PC market, even at the "hardcore" end, is very fragmented. Games like Sins of a Solar Empire sell lots despite not exactly pushing graphics hardware. The Orange Box sells through the roof on a five-year-old engine. Meanwhile games like Crysis barely break 1 million sold, despite their overhyped graphics.

Why? Because despite the PC market having the greatest potential for graphics, its gamers are more willing than any to compromise on graphics.

Jan
06-02-2008, 01:49 AM
I think that most hardcore PC gamers inform themselves more than the average Joe about upcoming titles. That means they DON'T buy a game, even with great graphics, if the reviews say it isn't fun to play. That is because when you buy a high-end gaming system, you KNOW what you bought it for. Most people buying a console simply buy it "because you can play games with it".

For example, take a look at "Gears of War". It was a console-exclusive game at first, and it sold like hell because it was the first game to push the 360 to its limits. People owning a 360 wanted to see that. But it was really overhyped: all the console-magazine reviews said it was the greatest game ever!

A few years later it was released for the PC, and although it still gets top marks for graphics, many magazines give a more differentiated review, saying that the game-play becomes a bit boring over time, because it's really always the same.

I played it through, and in my opinion it is a nice game to play once. Not comparable with games like Half-Life or Max Payne, which I cannot count how often I have played.

Jan.

Dark Photon
06-02-2008, 04:29 AM
It's NOT Microsoft. It's annoying to keep hearing that one come up.
Maybe, maybe not, but prove it!

You're speculating as much as he is, and he's got as much right to it as you do.

(...unless you have some verifiable inside info you'd care to share.)

Personally I give his guess some potential given all the anti-competitive back-stabbing crap MS has pulled over the years. They aggressively strive to be the only solution available, by any means possible. OpenGL is an annoyance to that goal. Couple that with annoying facts like NVidia likes OpenGL, G80 features were available for OpenGL before DX10, MS has had trouble inflicting DX10/Vista on its user base, etc. GL was really getting on MS's nerves.

...and then, dead stop.

knackered
06-02-2008, 05:06 AM
yep. GL3 would have been a significant threat to the potential uptake rate of vista.

NeARAZ
06-02-2008, 05:16 AM
It's NOT Microsoft. It's annoying to keep hearing that one come up.
Maybe, maybe not, but prove it!
I believe OpenGL 3 is being held up by an Apatosaurus named Brontosaurus. Prove that it's not the case!

bobvodka
06-02-2008, 06:03 AM
yep. GL3 would have been a significant threat to the potential uptake rate of vista.

Well, no, it wouldn't have, because OpenGL3.0 is pitched at DX9-class hardware.

Vista, on the other hand, allows access to DX10 features, and while you can get at them with OpenGL on NV hardware in XP, that's pretty much it.

So, no, OpenGL3.0 wouldn't have made any difference at all in that regard, as it's just a reskin of old technology. Now, Mt. Evans might well have made/will make a difference, but at this point you've got a couple of major hurdles to get over, namely:
- a large existing DX-based code base
- crap or non-existent OpenGL debugging tools

DX has tools for everything, from the basic PIX which comes with the SDK (and is pretty good on its own) to the full-blown PerfHUD that NV provide.

Now, OpenGL, what is there?
- glIntercept is handy, but I disliked having to dig through the XML files it creates
- gDEBugger, which is pretty good, however it costs a small fortune

So, with a lack of tools and an established code base, what reason would anyone have to switch to OpenGL from DX? Maybe... MAYBE at the time the swap towards DX10-based renderers was happening, Mt. Evans could have grabbed some share, but right now we are still waiting on GL3.0, and if it's GL3.0 sans Mt. Evans which gets released at Siggraph, we are still looking at another couple of months until the GL3.x spec for Mt. Evans appears. Not to mention the lead time to make drivers, test them, and debug them.

Time passes, Vista sells more, companies work on D3D10 technology, the window is lost.

Right now, OpenGL is NOTHING to MS; D3D is the dominant API on Windows, only one major engine uses OpenGL, and that is pretty much sidelined by the D3D9-powered Unreal Engine which is showing up in everything. OpenGL3.0 could have at least cleaned up the API, but without the rest of the supporting technology, D3D is still going to wander around like the king it is.

Now, on the charge of MS having anything to do with this slowdown: as I previously mentioned, unless it is a patent issue with some technology such as S3TC compression, or something else they picked up when the great SGI IP sale happened, they won't have a say in what goes on.

And if it was/is an IP issue, well, tough luck really. They have the right to do it, and you can complain about 'dirty tricks' and 'underhandedness' as much as you like; the facts are, in this commercially driven world of ours, it's up to MS what they do with what they own.

Chances are we'll never find out what held things up; in the end it'll be released at SIGGRAPH and everything moves forward from there. Of course, the chance to make a dent in D3D's domination of the Windows platform will probably have passed by then, but never mind, eh... there is always next time...

Nicolai de Haan Brøgger
06-02-2008, 07:07 AM
Now, OpenGL, what is there?
- glIntercept is handy but I disliked having to dig through the created XML files
- gDEBugger, which is pretty good however costs a small fortune


Now it seems like ATI has better tools than NV supporting GL/GLSL. The ATI tools are free, but NV just refers developers to gDEBugger :p
PerfSDK support for OpenGL is minimal compared to DX (e.g. performance counters). Where is the GL support in PerfHUD?!

bobvodka
06-02-2008, 07:27 AM
Currently, we only have the resources to support one API for PerfHUD, but we'll definitely keep your feedback on OpenGL support in mind for the future.


By an NV representative on the NV dev forums: link (http://developer.nvidia.com/forums/index.php?showtopic=1562&view=findpost&p=3693)

On AMD's tools, I assume you are talking about GPU PerfStudio? I tried this while I still had my X1900XT, and despite following the instructions to the letter it was a fun game of 'watch it crash and not work'. Half the time it wouldn't connect to my OpenGL app; the other half it just locked up and died on me.

Demirug
06-02-2008, 09:31 AM
PerfSDK support for OpenGL is minimal compared to DX (e.g. performance counters). Where is the GL support in PerfHUD?!

Nvidia has commented on these questions in the past.

The majority of game developers (who are the primary users of these tools) prefer to see more features on the Direct3D side instead of an OpenGL version. Therefore the budget is spent on Direct3D.