View Full Version : OpenGL at GDC



Groovounet
02-11-2011, 09:32 AM
Interesting program at GDC.

http://www.khronos.org/news/events/detail/gdc-2011-san-francisco

So WebGL 1.0 is to be announced. What about OpenGL? Going by the update cadence of the OpenGL specification, we should be due an OpenGL 4.2 release... I remain unsure about that, but then again I had doubts at Siggraph 2010 and we got OpenGL 4.1.

What do you think:
- Do we need OpenGL 4.2?
- Do we need an update of OpenGL 4.1 that fixes the specification bugs?
- Do we need OpenGL to be more stable?

Alfonse Reinheart
02-11-2011, 10:47 AM
- Do we need OpenGL 4.2?

Yes. There's functionality we should have that we don't.


- Do we need an update of OpenGL 4.1 that fixes the specification bugs?

I'm not sure what bugs in particular you're referring to, but bug fixes should not be part of any timetable. They should just update the spec with fixes whenever they have a fix.


- Do we need OpenGL to be more stable?

What do you mean by that? The spec being more stable, or implementations?

glfreak
02-11-2011, 11:40 AM
Let's take it a few steps further and say OpenGL 4.5, maybe?

Conformance test certification?

Groovounet
02-11-2011, 03:35 PM
Conformance tests... at every BOF we hear the same "This time it's about to be done".

There are some tools that should be supported, like Glean (https://sourceforge.net/projects/glean/) and dEQP (http://www.drawelements.com/).

By stable I meant the specification: it keeps changing, and following it is only worthwhile for the few people really into it. The (many) others are like "what?" and barely go beyond shaders. To make modern OpenGL programming mainstream, I think we need more stability, but also better documentation, better drivers and better programmers (we need to learn too!).

Then again, OpenGL doesn't expose everything, so I think it needs to continue to evolve, but in a way where the paradigm doesn't shift drastically after a specification update. OpenGL 4.1 added basically no OpenGL 4 hardware features, but the API and its use changed a lot with the separate program stuff.

Alfonse Reinheart
02-11-2011, 04:18 PM
Glean (https://sourceforge.net/projects/glean/)

Going solely by the documentation on that page, Glean is not a project that should be "supported" in any way. They can't even be bothered to make actual releases, forcing you instead to download from CVS. Their documentation is ancient, asking MSVC users to use STLPort (which hasn't been updated in 2.5 years, and we're long past the days when VC++'s STL implementation had problems), it needlessly requires a simple external library that could have been included in the distribution (libtiff), etc.

This feels, in every way, like the worst aspects of Open Source development: that UNIX-ian philosophy of making everything as needlessly difficult for the user as possible.

I'd much rather the ARB hire someone to just do it the right way.


By stable I meant the specification: it keeps changing, and following it is only worthwhile for the few people really into it. The (many) others are like "what?" and barely go beyond shaders. To make modern OpenGL programming mainstream, I think we need more stability, but also better documentation, better drivers and better programmers (we need to learn too!).

Of course the spec keeps changing; they keep adding new things. Do you want them to stop?

Also, the ARB has no control whatsoever over any of the three points you mention. Well, they do have the "man" pages, but that's reference documentation and doesn't always tell the whole story.


Then again, OpenGL doesn't expose everything, so I think it needs to continue to evolve, but in a way where the paradigm doesn't shift drastically after a specification update. OpenGL 4.1 added basically no OpenGL 4 hardware features, but the API and its use changed a lot with the separate program stuff.

The paradigm didn't change at all for 4.1. If you wanted to use program separation, it was there. But it's not like they were saying you have to use it.

Also, that's what you have to do when you're playing catch-up, because you've consistently failed for a good 3-4 years to get important changes implemented in the spec. By all rights, ARB_separate_shader_objects shouldn't even be necessary; that should have been taken care of back before GL 2.0, when everyone already hated having to link their programs. But no; the all-knowing ARB decided to stick with 3DLabs's idiotic linking paradigm. The same goes for things like ARB_sampler_objects, DSA (this should have been done back when we saw how badly multitexture, itself already needlessly convoluted, interacted with shaders) and the like.
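
For illustration, here is roughly what the separated path looks like with GL 4.1 (a minimal sketch in C; the shader sources are trivial placeholders and error checking is omitted):

    /* A minimal sketch of the GL 4.1 separate shader objects path. */
    const char *vs_src =
        "#version 410 core\n"
        "void main() { gl_Position = vec4(0.0); }\n";
    const char *fs_src =
        "#version 410 core\n"
        "out vec4 color;\n"
        "void main() { color = vec4(1.0); }\n";

    /* One program object per stage, each compiled and linked independently. */
    GLuint vs = glCreateShaderProgramv(GL_VERTEX_SHADER,   1, &vs_src);
    GLuint fs = glCreateShaderProgramv(GL_FRAGMENT_SHADER, 1, &fs_src);

    /* A pipeline object mixes and matches stages without relinking. */
    GLuint pipeline;
    glGenProgramPipelines(1, &pipeline);
    glUseProgramStages(pipeline, GL_VERTEX_SHADER_BIT,   vs);
    glUseProgramStages(pipeline, GL_FRAGMENT_SHADER_BIT, fs);
    glBindProgramPipeline(pipeline);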

The ARB made many such mistakes. And they still haven't corrected all of them. And looking at the ARB_separate_shader_objects spec, you can see that they've made more.

Until they do, there will be many such "paradigm shifts". Or would you rather not have ARB_separate_shader_objects at all?

Eosie
02-12-2011, 07:27 AM
There are some tools that should be supported, like Glean (https://sourceforge.net/projects/glean/) and dEQP (http://www.drawelements.com/).
All tests from Glean were imported into Piglit, a relatively new project which contains thousands of OpenGL tests. There are also a lot of GL 3.x tests, like ones for ARB_explicit_attrib_location, instancing, and GLSL 1.30, to name a few. Most of the GL3 texture formats are thoroughly tested, e.g. there are tests for texture wrapping, mipmapping, render-to-texture, and so on and so forth. It also contains tests for non-core extensions like EXT_separate_shader_objects, EXT_texture_compression_latc, ARB_shader_stencil_export, AMD_conservative_depth, etc. Most tests read back color/depth/stencil values and compare them with the reference values they have. Also, OpenGL ES tests are being worked on. Piglit is huge and gets bigger and bigger.

http://cgit.freedesktop.org/piglit/
(there is no official release or web page, just the repository)

The problem with conformance testing is that you would need millions of tests to cover the whole of OpenGL. It's impossible to have that without a group of people working on it full-time for at least a couple of years. I don't think the ARB has enough resources to pay for developing such a project. Supporting and extending existing projects like Piglit may end up being considerably cheaper, since there is already commercial interest from some companies in developing Piglit.

When the ARB say "This time it's about to be done", they certainly lie. It will never be done.

V-man
02-12-2011, 02:19 PM
The problem with conformance testing is that you would need millions of tests to cover the whole of OpenGL.

You don't need a million tests.

There are implementations that can't even render wireframe mode properly. I have seen ATI drivers that render wires all over the place under certain conditions.

Perhaps conformance tests should be published; I've never seen one. When you download a driver, it never mentions which tests it has passed and which ones it has failed. We get to discover them as we go along.

Groovounet
02-12-2011, 03:01 PM
When the ARB say "This time it's about to be done", they certainly lie. It will never be done.

It has been done for OpenGL ES and WebGL, so I don't see why it would not be done for OpenGL. The past has shown that it hasn't been done yet, and WebGL and OpenGL ES probably receive more interest from the industry than OpenGL, but still, why not?

I would quite enjoy having a look at dEQP.

EDIT: Do you know of more tools that do implementation quality checks?

Eosie
02-13-2011, 07:23 PM
EDIT: Do you know of more tools that do implementation quality checks?
Just Mesa-demos, which only contains interactive rendering tests, but that is being ported to Piglit and automated as well. One of the ARB members told me that there were some non-public conformance tests for OpenGL 1.2. I guess that doesn't count, because nobody cares about such an old API any longer, right? I guess there is no such thing as conformant OpenGL anymore; nowadays it's more about de-facto behavior.

Eosie
02-13-2011, 07:53 PM
You don't need a million tests.
That depends on what you consider "conformant". If it means GL functions should not report GL_INVALID_VALUE on valid calls, that's pretty useless. If it means that some complex rendering should be done with various GL features and the interactions between them, using as much of the OpenGL pipeline as possible, and comparing the final contents of the colorbuffers and a depth-stencil buffer with reference values (with some small inaccuracies allowed), then you will need a fair number of tests for it to be useful.
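
For concreteness, a readback comparison of that kind might look roughly like this (a sketch; WIDTH, HEIGHT, TOLERANCE and the reference image are hypothetical, and this is not how any particular test suite actually structures its code):

    #define WIDTH     64
    #define HEIGHT    64
    #define TOLERANCE 3   /* allowed per-channel error */

    /* Read back the colorbuffer... */
    GLubyte pixels[HEIGHT][WIDTH][4];
    glReadPixels(0, 0, WIDTH, HEIGHT, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* ...and compare it per channel against a reference image
       (here a hypothetical array the test computed or stored). */
    int pass = 1;
    for (int y = 0; y < HEIGHT; ++y)
        for (int x = 0; x < WIDTH; ++x)
            for (int c = 0; c < 4; ++c) {
                int diff = (int)pixels[y][x][c] - (int)reference[y][x][c];
                if (diff < -TOLERANCE || diff > TOLERANCE)
                    pass = 0;
            }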

It's not just wireframe that's sometimes broken. If you use ARB_texture_swizzle with some particular formats on some particular drivers, you will notice that the channels are swizzled incorrectly. That's just an interaction of two features, the texture format and the texture swizzle, yet some drivers still aren't doing it right (this and other bugs in proprietary drivers have been discovered automatically by Piglit). Another thing: indexing arrays in ARB_vp with a negative offset (as in "array[x-2]") doesn't work on some drivers either, even though the spec allows it. And I could go on.
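
For reference, the swizzle half of that interaction is just four texture parameters (GL 3.3 / ARB_texture_swizzle), which a driver has to honor for every internal format:

    /* Sample a one-channel texture as (0, 0, red, 1). Running this against
       each internal format is exactly the kind of two-feature interaction
       a Piglit-style suite has to cover. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_ZERO);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_G, GL_ZERO);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_ONE);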

Alfonse Reinheart
03-05-2011, 05:18 PM
Sorry for the bit of necromancy, but GDC is over now. There was a presentation made about OpenGL. Does anyone know anything more than that?

Aleksandar
03-06-2011, 12:31 PM
I'm also interested in what happened.

One thing is for sure: the new revision of OpenGL was not released. Maybe your critiques of the inconsistency and immaturity of the previous releases made them more careful. ;)

Heiko
03-07-2011, 12:46 AM
It looks like we aren't getting a new OpenGL version yet, even though there are some interesting extensions that would clearly make a nice addition to core. I hope we get some news from Khronos about their plans for OpenGL: the consistent releases of the past two years raised expectations for another release during GDC.

barthold
03-09-2011, 12:26 PM
The GDC presentations were focused on OpenGL 4.1, plus the release of the GLSL man pages (thanks to Graham!). The slides should be up shortly.

As I've consistently said, the ARB is both feature and schedule driven. We are definitely working on the next version of OpenGL, and I think you'll like it when it's released.

Barthold
(With my ARB hat on)

Groovounet
03-10-2011, 09:03 AM
Available:
http://www.khronos.org/developers/library

I guess we could have waited until Siggraph 2012 for "a Sentimental Mood" and other "Love Story" numbers. I always look with skepticism at a program when the topic turns to commemoration or congratulation, leaving behind real new content, ideas and novelties.

That said, I wasn't expecting much, OpenGL 4.2 included. However, if we are going to like the next version of OpenGL, why not give some clues about it at an event like GDC?

In any case, a big thanks to Graham, who has been the best OpenGL ecosystem contributor of the past few years, with the OpenGL 3.3 and 4.1 man pages, the OpenGL SuperBible 5th edition and now the GLSL man pages!

mfort
03-10-2011, 10:55 AM
The biggest news for me is the dawn of a Conformance Test Suite for OpenGL. It is definitely a long-term goal, and we should all benefit from its existence. Of course, it all depends on how they maintain it: whether they are going to add tests for bugs discovered by the OpenGL community, and whether they are going to add tests for older bugs to avoid regressions.

Alfonse Reinheart
03-10-2011, 11:38 AM
I always look with skepticism at a program when the topic turns to commemoration or congratulation, leaving behind real new content, ideas and novelties.

To be fair, looking at the WebGL presentations, they were all mostly fluff. So it's not like OpenGL was short-changed in that department.

It's surprising that the Game Developers Conference was used for marketing fluff. Game developers need to know what this stuff is and how to use it. They don't need buzzwords and nonsense; they need real, factual data about what it is they're expected to use.

It's sad to see Khronos so out of touch with some of its users in this respect.

Chris Lux
03-11-2011, 01:58 AM
Yes, there was a time when new tech demos were presented and explained using OpenGL. Now it's all D3D11.

I still hope that the future of OpenGL lies in high-end desktop/workstation graphics and not just in the mobile/low-end market, as it comes across from the presentations.

NVIDIA has so many OpenGL people on board; why not do some exciting tech demos using OpenGL again?

Eosie
03-11-2011, 12:17 PM
There was even a time, when Windows Vista had just been released, when D3D10 was basically unused. If OpenGL had been redesigned from scratch back then, we would be in a better position now. It was a great opportunity to fill the void with good stuff. Now it's too late: D3D11 is too strong, and its API is clean and nice, which can't be said of OpenGL.

I don't believe both major hardware vendors will cooperate to make a comprehensive set of conformance tests. One alternative is to use Piglit, which looks like this (http://people.freedesktop.org/~mareko/piglit/). It's command-line only, but can generate an HTML report. The project clearly shows that the drivers from both hardware vendors are broken and in different ways, and it's easy to crash them when doing even simple things.

Alfonse Reinheart
03-11-2011, 12:20 PM
I don't believe both major hardware vendors will cooperate to make a comprehensive set of conformance tests.

But they're not involved in making the conformance tests. Khronos is funding the development itself.

Chris Lux
03-11-2011, 08:58 PM
very sad, but very true:

http://www.bit-tech.net/news/gaming/2011/03/11/carmack-directx-better-opengl/1

Heiko
03-12-2011, 02:42 AM
I'm looking forward to the conformance tests. I think it's great to see progress on that part of OpenGL. We have a nice spec now with OpenGL 4.1 (although there is always room for improvement). It can keep up with DirectX 11 in terms of capabilities. The next necessary step is to improve OpenGL drivers and make sure they are on the same level as DirectX drivers. A conformance test definitely helps in that area, I think.

The GLSL man pages are also very nice to have.

V-man
03-12-2011, 02:59 PM
They should finally clean up the extension registry.
Most of it is ancient.

Dark Photon
03-13-2011, 01:26 PM
They should finally clean up the extension registry.
Most of it is ancient.
Well, I actually find access to all the extension docs useful (and regularly search them). It's helpful to be able to go back and read about just the APIs/behavior associated with a particular feature or set of features, and the discussion that went with it. With the spec, by contrast, only some of that is there, and what is there is mixed in with everything else documenting the full API.

What I could see though (which I think may address your point) is having different web page views into the registry for different OpenGL spec versions which "hide" or display in a different color the extensions which are subsumed in that GL version. For instance, GL 3.0, 3.1, etc. views would not list (or would list in a different color) EXT_draw_instanced and EXT_framebuffer_object (for example) because that's in the spec as of GL 3.0. But you'd still have an "all extensions" view where you could go surf any/all extension specs if you want.

Something like a pickbox at the top of the www.opengl.org/registry (http://www.opengl.org/registry) page where you select the OpenGL spec version you're targeting, and it would change the color of all the subsumed extensions to green or a darker grey.

Seems this'd be pretty easy, and any of us could do this. But I suspect it's just never been important enough to any of us.

Groovounet
03-13-2011, 05:14 PM
I am also totally against "cleaning up" the extension registry.
I still use it regularly, and I actually find it quite annoying that the old OpenGL specifications (before 2.1) have been removed, as I still use them sometimes (I saved a copy on my drive!).

So yes, the registry could be improved, but removing something is not an improvement, especially when we already have so little documentation.

V-man
03-14-2011, 09:01 PM
Sure, I agree.
Just simplify it; right now it is one long flat list of everything from 199X until now.


With the spec, by contrast, only some of that is there, and what is there is mixed in with everything else documenting the full API.
I find the man pages a little better than the spec (PDF) file.
http://www.opengl.org/sdk/docs/man3/

kRogue
03-14-2011, 11:54 PM
The older specs are still sitting around on opengl.org, for example:
http://www.opengl.org/documentation/specs/version2.0/glspec20.pdf

Gotta love wget.

ScottManDeath
03-14-2011, 11:56 PM
It would be quite helpful (and easy to do) to sort the OpenGL specs chronologically into the extension list, e.g. by adding a headline for each version. That way, that large wall of links is broken up into more manageable chunks, and one can easily find the extensions for each hardware generation.

Alfonse Reinheart
03-15-2011, 12:19 AM
It would be quite helpful (and easy to do) to sort the OpenGL specs chronologically into the extension list, e.g. by adding a headline for each version. That way, that large wall of links is broken up into more manageable chunks, and one can easily find the extensions for each hardware generation.

That would be nice if all the specifications fit into such a simple mold. Unfortunately, that is simply not possible. Not for all extensions, at any rate.

For example, take ARB_draw_buffers_blend. This extension was adopted into core OpenGL in 4.0. However, every piece of 3.x ATI hardware supports it (while no 3.x NVIDIA hardware does). So where do you file it? It would be misleading to call it purely a 4.0 feature, since it is not restricted to 4.0 hardware. But you can't call it a 3.x feature since it's not core in 3.x.
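
(For context, what that extension exposes is independent blend state per draw buffer, which is a genuine hardware capability rather than an API fix. A minimal sketch using the GL 4.0 core entry points, assuming an FBO with two color attachments is bound:)

    /* Different blend state on each color attachment. */
    glEnablei(GL_BLEND, 0);
    glBlendFunci(0, GL_ONE, GL_ZERO);                      /* buffer 0: opaque  */
    glEnablei(GL_BLEND, 1);
    glBlendFunci(1, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* buffer 1: blended */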

Then there are those extensions which have never been adopted into core, but are nonetheless widely available.

And then there are extensions that have essentially no hardware requirements. There's no reason you couldn't implement ARB_separate_shader_objects on an old GeForce FX or Radeon 9500, for example. The same goes for ARB_sampler_objects or ARB_vertex_array_object. How do you define the "hardware generation" for extensions that are, for all intents and purposes, API fixes or improvements?

Ultimately, I think the registry is fine for what it is. It is a reference, and it should be treated as such. The problem is that reference material should not be the only source of information about a particular feature. And for a great many extensions and core features, if you want decent information about how they work, the registry materials are all you have.

The man pages are a decent start, but they're ultimately not much different from the specifications in their complexity. The man pages are organized by function name; if you don't know what functions to look for, you're lost. This is fine for reference material, which is what they're for, but people learn best from actual documentation, not reference material.

Actual documentation is generally what the Wiki should be for. But the wiki is basically the Wild West: some parts of it are good, some are incomplete, some are just random infodumps (good though the info may be, finding it is difficult if not impossible), and some of it is just out of date or flat-out wrong.