Well, the new Newsletter is out. Some comments.
Async object creation: Sure, why not.
Removing the need to have a complete set of shaders linked into a single program object: I’m somewhat split on this. As long as it doesn’t create any performance deficiencies when using two programs compared to one, I’m fine. I’m just slightly concerned that it could cause a bunch of string compares every time you bind multiple shaders for rendering. That’s assuming you still bind shaders to something in order to provoke rendering…
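To be concrete about why I’d worry: if the implementation has to match separate shaders up by the names of their varyings at bind time, that’s a strcmp per interface variable per bind, versus a single integer compare if the match is resolved once up front. A toy illustration (all names here are mine, nothing from the newsletter):

```c
#include <assert.h>
#include <string.h>

/* Matching by name: one strcmp per varying, every time you bind.
   This is the cost I'm worried about if linkage is deferred. */
static int match_by_name(const char *out_var, const char *in_var) {
    return strcmp(out_var, in_var) == 0;
}

/* Matching by pre-resolved handle: resolve the string once at link
   time, then binds are just integer compares. */
static int match_by_handle(int out_slot, int in_slot) {
    return out_slot == in_slot;
}
```

The handle path is obviously what you want on the hot path; the question is whether the new model forces the name path on every bind.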
Image buffers: Like, what took you so long? Does this mean that direct rendering to vertex buffers (that is, rendering to a renderbuffer/texture buffer that can then be bound as a regular buffer object for vertex access) just works?
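If it does just work, I’d picture a flow something like this. Pure speculation on my part; every function name below is invented, none of it is from the newsletter:

```
// Speculative pseudocode -- all names invented.
buffer = glCreateBuffer(...);                  // one buffer object
glAttachImageToFramebuffer(fbo, buffer);       // pass 1: render into it
DrawGeneratorPass();                           // GPU writes vertex data
glBindVertexBuffer(0, buffer, 0, stride);      // pass 2: bind the same
DrawUsingGeneratedVertices();                  //   buffer as vertex input
```

The point being: no copy, no readback, the same buffer object just shows up on both sides.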
Uniforms grouped in blocks by the shader: Not entirely sure this is a good idea. My first question is this: is this something that’s going to require Uber-hardware, or can any hardware that currently handles glslang do this?
Second question: is it necessary for the buffer object (possibly a new subtype?) to specify all of the parameters that the uniform block(s) in the shader use?
Third, how would this impact performance?
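For reference, here’s roughly what I imagine the shader side looking like, with a buffer object supplying the block’s storage. The block syntax is my guess at what they mean; the binding call is entirely invented:

```
// Guesswork: a shader-declared uniform block...
uniform Transform {
    mat4 modelview;
    mat4 projection;
};

// ...backed by a buffer object on the app side (invented name):
// glBindBufferToUniformBlock(program, "Transform", buffer);
```

My third question above is really about this binding step: does the driver have to validate the buffer’s layout against the block every time, or once at attach time?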
On Longs Peak-to-GL2.1 integration: If you’re going to do the whole multiple-context thing, you definitely need to call it OpenGL 3.0 when it ships. Any other name, like OpenGL 2.2, would not properly show the depth of the changes involved.
As for the actual implementation of cross-usage, it seems fine. GL 2.1 should be able to use Longs Peak objects, but Longs Peak should not be able to use 2.1 stuff; that would just add clutter to an API that is supposed to be clean.
I am curious how you intend to implement the multi-context option, since the parameters for context creation are all pretty much fixed and un-upgradable. Or, at least, they are in Windows.
On template objects: Good name, BTW. The use of the command “glCreateTemplate(GL_IMAGE_OBJECT)” to create the template object suggests that the result significantly depends on the parameter enum token passed into the function. Does this mean that if you create an image object template, for example, errors will occur if you start giving it attributes meant for, say, a framebuffer object?
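Spelled out, the case I’m asking about looks like this. Apart from glCreateTemplate itself, which the newsletter shows, every name here is my own invention:

```
// Pseudocode -- only glCreateTemplate is from the newsletter.
tmpl = glCreateTemplate(GL_IMAGE_OBJECT);
glTemplateAttrib(tmpl, GL_FORMAT, GL_RGBA8);       // sensible for an image
glTemplateAttrib(tmpl, GL_WIDTH, 256);
glTemplateAttrib(tmpl, GL_COLOR_ATTACHMENT0, img); // framebuffer-only attribute:
                                                   //   error here, or deferred
                                                   //   to object creation?
image = glCreateImage(tmpl);
```

Whether the mismatched attribute fails immediately or only when the object is created matters a lot for debuggability.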
As for more general questions, here’s one.
OK, so by now, we have a pretty good idea of what a number of different constructs in Longs Peak will look like compared to GL 2.1. What I want to know is what some of the more forgotten parts of GL will look like. Will there be objects for setting blend parameters or alpha testing? What about glViewport or glDepthRange? How do we go about setting those bits of data that don’t cleanly fit into known object types?