Originally posted by Pentagram:
[b]
myCube = RenderCubeMap(worldSpaceOrigin)
Problem with adding this is that these sorts of things would be callbacks to the user of the system. This would require every user to implement lots of callbacks to support all of these. (Well, I guess you could cut down on callbacks by having only "RenderSceneForFrustrum" and just calling it 6 times for a cubemap, etc… Still, it would require some extra "meta" state. Say you want to render to a sniper scope with some "heat" effect: this would require all shaders to be overridden by the heat effect, and it's not really clear to me how this should be done… You could add techniques named blahblah_Heat to every effect or something similar, but that's a bit hacky.)
[/b]
Yeah, I think these are the non-trivial things to get right and make generally usable.
My (not yet finished) approach is to have meta states / equivalence classes for vertex attributes, uniforms, and textures. The nice thing is that you also get looser coupling between the effect description, the material, and the render object description.
Each meta state consists only of a name and data type info. Meta states can be registered/created at runtime.
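A meta state as described above is really just a (name, data type) pair plus a runtime registry. A minimal sketch of what that could look like, assuming hypothetical names (`MetaState`, `MetaStateRegistry`, `DataType` are mine, not from any real engine):

```cpp
// Sketch of a runtime meta-state registry: a meta state is just a name
// plus data-type info, and states can be registered at runtime.
#include <map>
#include <string>

enum class DataType { Float, Vec3, Mat4, Texture2D };

struct MetaState {
    std::string name;
    DataType    type;
};

class MetaStateRegistry {
public:
    // Register a meta state at runtime; returns false if the name is
    // already taken with a different type.
    bool registerState(const std::string& name, DataType type) {
        auto it = states_.find(name);
        if (it != states_.end())
            return it->second.type == type;
        states_[name] = MetaState{name, type};
        return true;
    }

    // Look up a meta state by name; nullptr if unknown.
    const MetaState* find(const std::string& name) const {
        auto it = states_.find(name);
        return it != states_.end() ? &it->second : nullptr;
    }

private:
    std::map<std::string, MetaState> states_;
};
```

A technique, a material, and a mesh would then all refer to the same registered names, which is what gives the loose coupling between the layers.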
Then at the bottom I have the render technique description, with general render state settings and meta states for the technique's required parameters.
On top of that I have a material defining some of the meta states (like textures and some uniforms).
And on top of that I have a MaterialBinding that maps vertex attribute meta states to concrete render state objects, to bind the mesh's vertex data.
The render technique, material, and vertex data/mesh are data driven; the MaterialBinding is generated.
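The layering above can be sketched roughly like this: the technique lists required meta states, the material defines some of them, the generated binding resolves the rest against the mesh, and whatever remains is undefined. All class names here are illustrative placeholders, not the actual system:

```cpp
// Sketch of the technique/material/binding layering: a technique lists
// required meta states, a material defines some, vertex attributes of
// the mesh cover others, and the generated binding reports what's left.
#include <set>
#include <string>
#include <vector>

struct RenderTechnique {
    std::set<std::string> requiredMetaStates; // e.g. "Position", "DiffuseMap"
};

struct Material {
    std::set<std::string> definedMetaStates;  // e.g. "DiffuseMap"
};

struct Mesh {
    std::set<std::string> vertexAttributes;   // e.g. "Position", "Normal"
};

// Generated MaterialBinding step: return the meta states still
// unresolved after consulting the material and the mesh.
std::vector<std::string> buildMaterialBinding(const RenderTechnique& t,
                                              const Material& m,
                                              const Mesh& mesh) {
    std::vector<std::string> undefined;
    for (const auto& name : t.requiredMetaStates) {
        if (m.definedMetaStates.count(name)) continue;   // defined by material
        if (mesh.vertexAttributes.count(name)) continue; // bound to vertex data
        undefined.push_back(name);                       // needs a callback
    }
    return undefined;
}
```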
For all required meta states that are still undefined after the MaterialBinding step, you can register callbacks; otherwise the object cannot be rendered with that material.
Well, the callback and undefined meta state handling is still in the implementation phase, but I think some nice things are possible there, like callbacks or providing conversion between meta states and so on.
I think you really have to provide some kind of callback or other feedback coupling to the rest of the render system. But with template policies and a collection of predefined helper methods, that should hopefully be possible in a generic way.
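One possible shape for that callback coupling, as a toy sketch (a real version might use template policies instead of `std::function`, and the single-float payload is purely for illustration):

```cpp
// Sketch of callback registration for undefined meta states: the
// application registers a provider per meta state, and an object is
// only renderable once every required meta state has a provider.
#include <functional>
#include <map>
#include <set>
#include <string>

class MetaStateProviders {
public:
    using Provider = std::function<float()>; // toy payload: one float

    void registerCallback(const std::string& metaState, Provider p) {
        providers_[metaState] = std::move(p);
    }

    // Renderable only if every required meta state can be resolved.
    bool canRender(const std::set<std::string>& required) const {
        for (const auto& name : required)
            if (!providers_.count(name))
                return false;
        return true;
    }

private:
    std::map<std::string, Provider> providers_;
};
```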
And on the heat shader example: I think it could work if you impose some restrictions on how the shaders are structured.
[b]This is somewhat more generic, but still requires the engine to recognize "ShadowTechnique".[/b]
Yes, but you need a feedback binding to the render system anyway. I just think it should be minimal and easy to plug in.
And with something like renderScene(Material*, Frustum*), as you already mentioned, the most important cases like shadow maps or reflections can be handled.
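To illustrate the single-entry-point idea: with one renderScene(Material*, Frustum*)-style callback, a cubemap is just six invocations with per-face frustums. Everything here (the toy `Frustum`, `makeCubeFaceFrustum`) is a hypothetical sketch, not a real API:

```cpp
// Sketch: render a cube map through one user-supplied scene callback,
// called once per face with an override material and a face frustum.
#include <functional>

struct Material {};
struct Frustum { int face; }; // toy frustum tagged with its cube face

// Hypothetical helper: build a 90-degree frustum for one cube-map face.
Frustum makeCubeFaceFrustum(int face) { return Frustum{face}; }

// Invoke the user's single renderScene callback six times; returns the
// number of calls made, so no per-effect callbacks are needed.
int renderCubeMap(const Material& overrideMaterial,
                  const std::function<void(const Material&,
                                           const Frustum&)>& renderScene) {
    int calls = 0;
    for (int face = 0; face < 6; ++face) {
        renderScene(overrideMaterial, makeCubeFaceFrustum(face));
        ++calls;
    }
    return calls;
}
```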
[b]Hmmz, thinking about the assign-to-textures thing, it could probably be handy at other levels too, allowing generic expressions on textures. If they are in the global state they could just be evaluated at load time, letting you mix two textures, generate normals from bump maps, … all with only a load-time overhead.[/b]
What do you mean by this? Assign meta states to textures?
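Whatever the intended coupling is, the load-time texture expression idea mentioned above can be sketched very simply. The single-channel `Texture` and the `mix()` helper are illustrative assumptions, standing in for a real expression evaluator:

```cpp
// Toy load-time texture expression: since both inputs are known at load
// time, mix(a, b, t) is evaluated once into a baked texture instead of
// being computed per frame in a shader.
#include <cstddef>
#include <vector>

using Texture = std::vector<float>; // toy: single-channel texel array

// Linearly blend two textures; the result is a new, baked texture.
Texture mix(const Texture& a, const Texture& b, float t) {
    Texture out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i)
        out[i] = a[i] * (1.0f - t) + b[i] * t;
    return out;
}
```

Generating normals from a bump map would be another such expression, evaluated the same way at load time.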