OpenGL at GDC

Interesting program at GDC.

http://www.khronos.org/news/events/detail/gdc-2011-san-francisco

So WebGL 1.0 is to be announced. What about OpenGL? Going by the update rate of the OpenGL specification, we should get an OpenGL 4.2 release… I remain unsure about that, but then again I had doubts at Siggraph 2010 and we got OpenGL 4.1.

What do you think:

  • Do we need OpenGL 4.2?
  • Do we need an update of OpenGL 4.1 that fixes the specification bugs?
  • Do we need OpenGL to be more stable?
  • Do we need OpenGL 4.2?

Yes. There’s functionality we should have that we don’t.

  • Do we need an update of OpenGL 4.1 that fixes the specification bugs?

I’m not sure what bugs in particular you’re referring to, but bug fixes should not be part of any timetable. They should just update the spec with fixes whenever they have a fix.

  • Do we need OpenGL to be more stable?

What do you mean by that? The spec being more stable, or implementations?

Let’s take it a few steps further and say OpenGL 4.5, maybe?

Conformance test certification?

Conformance tests… at every BOF we get the same “This time it’s about to be done”.

There are some tools that should be supported, like Glean https://sourceforge.net/projects/glean/ and dEQP http://www.drawelements.com/

By stable I meant the specification: it keeps changing, and following it only suits the few people really into it. The (many) others are like “what?” and barely go beyond shaders. To generalize modern OpenGL programming, I think we need more stability, but also better documentation, better drivers and better programmers (we need to learn too!)

Then, OpenGL doesn’t expose everything, so I think it needs to continue to evolve, but in a way that the paradigm doesn’t drastically change after a specification update. OpenGL 4.1 has basically no new OpenGL 4 hardware features, but the API and its use have changed a lot with the separate program stuff.

Glean https://sourceforge.net/projects/glean/

Going solely by the documentation on that page, Glean is not a project that should be “supported” in any way. They can’t even be bothered to make actual releases, forcing you instead to download from CVS. Their documentation is ancient, asking MSVC users to use STLPort (which hasn’t been updated in 2.5 years, and we’re long past the days when VC++'s STL implementation had problems); it needlessly requires a simple external library that could have been included in the distro (libtiff); etc.

This feels, in every way, like the worst aspects of Open Source development: that UNIX-ian philosophy of making everything as needlessly difficult for the user as possible.

I’d much rather the ARB hire someone to just do it the right way.

By stable I meant the specification: it keeps changing, and following it only suits the few people really into it. The (many) others are like “what?” and barely go beyond shaders. To generalize modern OpenGL programming, I think we need more stability, but also better documentation, better drivers and better programmers (we need to learn too!)

Of course the spec keeps changing; they keep adding new things. Do you want them to stop?

Also, the ARB has no control whatsoever over any of the three points you mention. Well, they do have the “man” pages, but that’s reference documentation and doesn’t always tell the whole story.

Then, OpenGL doesn’t expose everything, so I think it needs to continue to evolve, but in a way that the paradigm doesn’t drastically change after a specification update. OpenGL 4.1 has basically no new OpenGL 4 hardware features, but the API and its use have changed a lot with the separate program stuff.

The paradigm didn’t change at all for 4.1. If you wanted to use program separation, it was there. But it’s not like they were saying you have to use it.
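For reference, here is a minimal sketch of what that opt-in separation looks like with the GL 4.1 entry points. The function name and the vs_src/fs_src sources are placeholders, error checking is omitted, and it assumes a current 4.1 context with the function pointers already loaded:

```c
/* Sketch only: vs_src/fs_src are placeholder GLSL 4.10 sources with
 * matching in/out interfaces; error checking is omitted for brevity. */
static GLuint make_pipeline(const char *vs_src, const char *fs_src)
{
    /* Each stage is compiled and linked as its own program object... */
    GLuint vs = glCreateShaderProgramv(GL_VERTEX_SHADER,   1, &vs_src);
    GLuint fs = glCreateShaderProgramv(GL_FRAGMENT_SHADER, 1, &fs_src);

    /* ...and a pipeline object mixes and matches stages, no relink needed. */
    GLuint pipeline;
    glGenProgramPipelines(1, &pipeline);
    glUseProgramStages(pipeline, GL_VERTEX_SHADER_BIT,   vs);
    glUseProgramStages(pipeline, GL_FRAGMENT_SHADER_BIT, fs);
    return pipeline; /* bind with glBindProgramPipeline(pipeline) */
}
```

And again: nothing forces you onto this path; glUseProgram with a monolithic linked program still works exactly as before.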

Also, that’s what you have to do when you’re playing catch-up because you’ve consistently failed for a good 3-4 years to get important changes implemented in the spec. By all rights, ARB_separate_shader_objects shouldn’t even be necessary; that should have been taken care of when everyone hated having to link their programs back before GL 2.0. But no; the all-knowing ARB decided to stick with 3DLabs’s idiotic linking paradigm. The same goes for things like sampler_objects, DSA (which should have been done back when we saw how badly multitexture, itself already needlessly convoluted, interacted with shaders) and the like.
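To make the sampler_objects point concrete, here is a hedged sketch of what ARB_sampler_objects (core in GL 3.3) eventually allowed: filtering and wrap state living in its own object, instead of being baked into each texture. The parameter values below are arbitrary and only for illustration:

```c
/* Sketch: sampling state decoupled from the texture object.
 * Assumes a current GL 3.3 context; parameter values are examples. */
GLuint sampler;
glGenSamplers(1, &sampler);
glSamplerParameteri(sampler, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glSamplerParameteri(sampler, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glSamplerParameteri(sampler, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glSamplerParameteri(sampler, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

/* Overrides the sampling state of whatever texture is bound on unit 0. */
glBindSampler(0, sampler);
```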

The ARB made many such mistakes. And they still haven’t corrected all of them. And looking at the ARB_separate_shader_objects spec, you can see that they’ve made more.

Until they do, there will be many such “paradigm shifts”. Or would you rather not have ARB_separate_shader_objects at all?

All tests from Glean were imported into Piglit, a relatively new project which contains thousands of OpenGL tests. There are also a lot of GL 3.x tests, like ones for ARB_explicit_attrib_location, instancing, and GLSL 1.30, to name a few. Most GL3 texture formats are thoroughly tested, e.g. there are tests for texture wrapping, mipmapping, render-to-texture, and so on and so forth. It also contains tests for non-core extensions like EXT_separate_shader_objects, EXT_texture_compression_latc, ARB_shader_stencil_export, AMD_conservative_depth, etc. Most tests read back color/depth/stencil values and compare them with the reference values they expect. Also, OpenGL ES tests are being worked on. Piglit is huge and gets bigger and bigger.

http://cgit.freedesktop.org/piglit/
(there is no official release nor a web page, just the repository)

The problem with conformance testing is that you would need millions of tests to cover the whole of OpenGL. It’s impossible to have that without a group of people working on it full-time for at least a couple of years. I don’t think the ARB has enough resources to pay for developing such a project. Supporting and extending existing projects like Piglit may end up being considerably cheaper, since there is already commercial interest from some companies in developing Piglit.

When the ARB say “This time it’s about to be done”, they certainly lie. It will never be done.

You don’t need a million tests.

There are implementations that can’t even render wireframe mode properly. I have seen ATI drivers that render wires all over the place under certain conditions.

Perhaps conformance tests should be published. I’ve never seen one. When you download a driver, it never mentions passing any tests, nor which ones it has failed. We get to discover them as we go along.

It has been done for OpenGL ES and WebGL, so I don’t see why it would not be done for OpenGL. The past has shown that it hasn’t been yet, and WebGL and OpenGL ES probably receive more interest from the industry than OpenGL, but still, why not.

I would quite enjoy having a look at dEQP.

EDIT: Do you know more tools doing implementation quality check?

Just Mesa-demos, which only contain interactive rendering tests, but those are being ported to Piglit and automated as well. One of the ARB members told me that there were some non-public conformance tests for OpenGL 1.2. I guess that doesn’t count, because nobody cares about such an old API any longer, right? I guess there is no such thing as a conformant OpenGL anymore; nowadays it’s more about de facto conformance.

That depends on what you consider “conformant”. If it means GL functions should not report GL_INVALID_VALUE on valid calls, that’s pretty useless. If it means that some complex rendering should be done with various GL features and interactions between them, using as much of the OpenGL pipeline as possible, and comparing the final contents of colorbuffers and a depth-stencil buffer with reference values (with some small inaccuracies allowed), then you will need a fair number of tests for it to be useful.
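As a rough illustration of the latter approach, most such tests boil down to a probe-and-compare step like the sketch below. The function name and tolerance handling are made up for illustration; it assumes a current GL context and a bound, readable framebuffer:

```c
#include <math.h>
#include <stdbool.h>

/* Read back one pixel and compare it against a reference color,
 * allowing a small per-channel tolerance for permitted inaccuracy. */
static bool probe_pixel_rgba(int x, int y, const float expected[4],
                             float tolerance)
{
    float pixel[4];
    glReadPixels(x, y, 1, 1, GL_RGBA, GL_FLOAT, pixel);
    for (int i = 0; i < 4; ++i)
        if (fabsf(pixel[i] - expected[i]) > tolerance)
            return false; /* mismatch: report test failure */
    return true;
}
```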

It’s not just wireframe that’s sometimes broken. If you use ARB_texture_swizzle with some particular formats on some particular drivers, you will notice that the channels are swizzled incorrectly. That’s just an interaction of two features, the texture format and the texture swizzle, yet some drivers still aren’t doing it right. (This and other bugs in proprietary drivers have been automatically discovered by Piglit.) Another thing: indexing arrays in ARB_vp with a negative offset (as in “array[x-2]”) isn’t working on some drivers either, even though the spec allows it. And I could go on.
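For anyone who hasn’t used the extension, the kind of two-feature combination meant above is as simple as this sketch (the function name is hypothetical; it assumes a current GL 3.3 context or ARB_texture_swizzle support). A conformant driver must make a shader sample (0, 0, 0.75, 1) from this texture; some drivers got cases like this wrong:

```c
/* Sketch: a one-texel GL_R32F texture whose red channel is routed
 * into blue via the texture swizzle state. Illustrative values only. */
static GLuint make_swizzled_red_texture(void)
{
    GLuint tex;
    const float red = 0.75f;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, 1, 1, 0, GL_RED, GL_FLOAT, &red);

    /* Sampling this texture should now return (0, 0, 0.75, 1). */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_ZERO);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_G, GL_ZERO);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_ONE);
    return tex;
}
```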

Sorry for the bit of necromancy, but GDC is over now. There was a presentation about OpenGL. Does anyone know anything more than that?

I’m also interested in what happened.

The new revision of OpenGL was not released, that’s for sure. Maybe your critiques of the inconsistency and immaturity of previous releases made them more careful. :wink:

It looks like we don’t get a new OpenGL version yet, even though there are some interesting extensions that would clearly make a nice addition to core. I hope we get some news from Khronos about their plans for OpenGL: the consistent releases of the past two years raised expectations for another release during GDC.

The GDC presentations were focussed on OpenGL 4.1, plus the release of the GLSL man pages (thanks to Graham!). The slides should be up shortly.

As I’ve consistently said, the ARB is both feature and schedule driven. We are definitely working on the next version of OpenGL, and I think you’ll like it when it’s released.

Barthold
(With my ARB hat on)

Available:
http://www.khronos.org/developers/library

I guess we could have waited until Siggraph 2012 for “a Sentimental Mood” and other “Love Story” moments. I always look with skepticism on topics that turn to commemoration or congratulation, leaving behind real new content, ideas and novelties.

So I wasn’t expecting much, OpenGL 4.2 included. However, if we are going to like the next version of OpenGL, why not give some clues about the topics of interest at GDC?

In any case, a big thanks to Graham, who has been the best OpenGL ecosystem contributor of the past few years, with the OpenGL 3.3 and 4.1 man pages, the OpenGL SuperBible 5th edition, and now the GLSL man pages!

The biggest news for me is the dawn of a Conformance Test Suite for OpenGL. It is definitely a long-term goal, and we should all benefit from its existence. Of course, it all depends on how they maintain it: whether they add tests for bugs discovered by the OpenGL community, and whether they add tests for older bugs to avoid regressions.

I always look with skepticism on topics that turn to commemoration or congratulation, leaving behind real new content, ideas and novelties.

To be fair, looking at the WebGL presentations, they were all mostly fluff. So it’s not like OpenGL was short-changed in that department.

It’s surprising that the Game Developers Conference was used for marketing fluff. Game developers need to know what this stuff is and how to use it. They don’t need buzzwords and nonsense; they need real, factual data about what it is they’re expected to use.

It’s sad to see Khronos so out of touch with some of its users in this respect.

Yes, there was a time when new tech demos were presented and explained using OpenGL. Now it’s all D3D11.

I still hope that the future of OpenGL will still be in high-end desktop/workstation graphics and not just in the mobile/low-end market as it comes across from the presentations.

Nvidia has so many OpenGL people on board; why not do some exciting tech demos using OpenGL again?

There was even a time when Windows Vista was released and D3D10 was basically unused. If OpenGL had been redesigned from scratch back then, we would be in a better position now. It was a great opportunity to fill the void with good stuff. Now it’s too late. D3D11 is too strong and the API is clean and nice, which can’t be said about OpenGL.

I don’t believe both major hardware vendors will cooperate to make a comprehensive set of conformance tests. One alternative is to use Piglit. It’s command-line only, but it can generate an HTML report. The project clearly shows that the drivers from both hardware vendors are broken, in different ways, and that it’s easy to crash them when doing even simple things.

I don’t believe both major hardware vendors will cooperate to make a comprehensive set of conformance tests.

But they’re not involved in making the conformance tests. Khronos is funding the development themselves.