Barthold, thanks for the response too
Maybe it's the idea of stopping communication with the community that makes us so angry... I guess the ARB wasn't expecting a better reaction from the community?
3/ A 3.0_Forward_Compatible Context which is the "Real" version 3.0 with the legacy stuff removed.

No, sorry. The real 3.0 (aka Longs Peak) was more than just removing cruft. It was a different API that removed and changed the fundamentals of how things work. Tossing out the old API was just step 1 for LP. The new object model, the way object creation was atomic and inherently threaded, the way object creation was success-or-fail (taking away the guessing game that is OpenGL), etc. All that stuff is gone and will never see the light of day.
As I pointed out in my rebuttal to Barthold, features aren't what GL 3.0 was supposed to be about. It's like if someone promised you a car, but got you a boat instead. Yeah, a boat is kinda nice and all, but I can't drive on land with it.

To be fair, you need to visit his web site; he has a nice summary that's not as gloom-and-doom as your assessment.
Nobody is yelling. This was already outlined by Rob Barris; it is not confusing. The problem is that it is not a simplifying plan, quite the opposite.

Originally Posted by Simon Arbon
3.0 with legacy is not 2.2; in those terms it's a superset (although not exactly), and a key priority should have been to avoid this. 3.0 forward-compatible will coexist with it. Going forward, when 3.1 arrives, we're told that whether the deprecated functionality is actually unsupported or slow is entirely up to the vendors. There is no 2.x and 3.x path, which would have been a damned sight cleaner than the proposed plan. There will be the superset and the forward-compatible version, with the superset having some TBD support for legacy (in competition with other vendors).
You don't move away from the burden of legacy support by doing something that significantly complicates the driver situation and offers a support matrix instead. You'd be forgiven for seeing it as the worst of both worlds.
It's clear this is an attempt to be very developer-centric, but well, it is what it is.
Oh great, I just got my application startup down to under half a second and you decide to make extension string processing SLOWER.

Originally Posted by glspec30.20080811.pdf
I know a lot of people seem to have trouble reading a PCHAR, but then the spec itself doesn't help:
WRONG, people.

Originally Posted by glspec30.20080811.pdf
There is absolutely no reason at all to make an identical copy of a string you already have; it should simply be scanned from beginning to end while checking the extension names. This footnote is supposed to be a warning, but instead it's telling people it's OK to do the wrong thing.
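For what it's worth, such a scan is only a few lines of C. A minimal sketch (the function name HasExtension is mine, not anything from the spec):

```c
#include <string.h>
#include <GL/gl.h>

/* Scan the classic GL_EXTENSIONS string in place, with no copying:
   walk the space-separated list and compare whole tokens, which also
   avoids the old substring-match bug (e.g. matching GL_ARB_texture
   inside GL_ARB_texture_float). */
int HasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);

    while (ext && *ext) {
        const char *end = strchr(ext, ' ');
        size_t tok = end ? (size_t)(end - ext) : strlen(ext);
        if (tok == len && memcmp(ext, name, len) == 0)
            return 1;
        ext = end ? end + 1 : ext + tok; /* next token, or the final NUL */
    }
    return 0;
}
```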
There have been many posts to the suggestions forum proposing better ways to detect extensions, such as exposing the ones with official extension numbers as a set/bitmask, or providing an array of pointers to the start of each name.
But the *GetStringi( enum name, uint index ); option should only be there to help people with limited programming skills; those who know what they are doing should still have access to the whole string.
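For reference, the indexed query boils down to this. A sketch, assuming a 3.0 context and that glGetStringi has been resolved through the platform's GetProcAddress mechanism (and a 3.0-aware header for GL_NUM_EXTENSIONS):

```c
/* GL 3.0 indexed extension query: one name per call, no string parsing. */
GLint count = 0, i;
glGetIntegerv(GL_NUM_EXTENSIONS, &count);
for (i = 0; i < count; ++i) {
    const GLubyte *name = glGetStringi(GL_EXTENSIONS, (GLuint)i);
    /* compare 'name' against the extension you are looking for */
}
```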
Actually, there is one way extension handling should be improved: the application should be able to say which version of OpenGL it was written for and ask for only those extensions that are not core in that version.
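Something along these lines, purely hypothetical (glGetVersionExtensions does not exist anywhere; it is just a name for the idea):

```c
/* Hypothetical: report only extensions that are not already core in the
   version the application targets. No such entry point exists today. */
const char *extra = glGetVersionExtensions(2, 1); /* "I was written for 2.1" */
/* 'extra' would omit everything promoted to core by 2.1, leaving a much
   shorter list to scan at startup. */
```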
That's all well and good, but I really think one thing should be done: the spec spends a lot of words talking about functionality that is then deprecated in an appendix. I think EACH AND EVERY deprecated function should have a big red DEPRECATED stamp next to its description. And you should also release a second PDF that simply does not contain the deprecated functionality.

Originally Posted by Rob Barris
Maybe I'm missing the point, but not completely. BUT one point of OpenGL 3.0 was to make the development of drivers much easier. Now, with several profiles to worry about, how in hell is this easier on the driver developers? You _have_ to support the legacy profiles and you do want the performance enhancements possible with the reduced profile. The profiles have to interact, and driver developers surely do not want to maintain x profile code branches.

Originally Posted by Rob Barris

So tell us: how can this improve overall driver quality?
Sounds like a new era of total chaos...
I agree that a lot of what we were promised is missing and I am very disappointed, but I am committed to a Windows/Linux/Mac application so I have no choice but to make do.

Originally Posted by Korval

My previous job was as a quality control engineer, and if anyone had suggested this "deprecate and evolve a bit at a time" plan to me I would have had them sacked; it's only asking for trouble.
But who in their right mind is going to use 3.0_Full anyway? You would have to be crazy.

Originally Posted by dorbie

The 2.x support needs to stay so old applications still work, but if most people start using 3.0_Forward_Compatible immediately, then the vendors can concentrate their efforts on getting this working well, and nobody will care if 3.0_Full is full of bugs because no one will be using it.
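For anyone wanting to do exactly that, creating a forward-compatible context comes down to the new WGL_ARB_create_context extension. A minimal sketch for Windows (GLX gets an equivalent extension; error handling omitted):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_CONTEXT_* tokens and the PFN typedef */

/* Sketch: ask for a 3.0 forward-compatible context. Assumes a temporary
   legacy context is current so wglGetProcAddress can resolve the entry
   point, and that hDC is a valid device context. */
HGLRC CreateForwardCompatibleContext(HDC hDC)
{
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return 0; /* driver does not expose WGL_ARB_create_context */

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0 /* attribute list terminator */
    };
    return wglCreateContextAttribsARB(hDC, 0, attribs);
}
```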
You've just said this spec is three times the size it needs to be, and legacy support with all its burdens is now essential just to support crazy people.

Originally Posted by Simon Arbon

You go on to outline a nightmare that can be avoided: just have 2.x and 3.x-forward, no superset. It's easier, less buggy, and gets everyone where they either want to be or deserve to be faster.
The problem is, we all expected and were looking forward not to a new version of OpenGL, but to a new API. Instead, the ARB sadly decided to be "politically correct" and make the transition slowly. Now, we all know that this kind of thing does not work.
Several people have pointed out that writing a new driver for an LP model would be too much work. I have to disagree. The most complicated part, GLSL, is already there in current drivers, and the rest could be done quickly by writing new interface functions for the driver core, since LP was supposed to do the same thing as GL 2.1, only simpler. The deprecation model? Well, dorbie and others are right... we need a "new features only" spec ASAP!
Of course, the GL3 spec is not so bad if we look at it as a 2.2 revision. Lots of interesting things are in the core now. I don't really miss the geometry shaders; I think the decision to keep them as an extension was right. The more ES-like shading language is also a very good idea.
But this spec does not address one of the most important problems of GL: bad driver support. It does nothing to ease the life of the driver developers, as LP should have done. Result: Nvidia will have new drivers soon, ATI won't, Intel... well, who cares about them anyway. While GL3 supports new features, there is nothing in it that would make it attractive for the developer or the IHV. Therefore, the stagnation just continues. GL is more and more becoming a dead standard, a bloated mass of spec with no particular sense. It still includes EVIL features like selection mode, despite driver vendors having stopped caring about it a long time ago, only because of some CAD guys with a total inability to write good code (who don't need the new features anyway and could have stayed with 2.x). You can't please everyone: either you do things right (and then you MUST break backward compatibility), or you stay with the old API model (= bad drivers, guesswork, inconsistencies, etc.).
Finally, I would like to repeat my point: it was never about the new features, guys; it was the new API that was needed and waited for! This is the reason why we are so upset...