Scalar binary/intermediate shader code
I think it's high time we got a standard binary shader format. But it should be scalar and NOT vectored! Why scalar? Here are my reasons:
1) Let's define the term "general scalar" to mean anything that can be converted (relatively) easily to and from simple scalar form, including any kind of instruction-level parallelism but excluding any kind of data-level parallelism.
As it turns out, "general scalar" GPU architectures are more efficient than vectored ones: they are able to utilize the hardware resources better. For this reason, all major GPU architectures have been "general scalar" for 10+ years now. On them, any vectored code is converted to their native "general scalar" form before it is executed. Thus vectored code remains useful only as a syntactic convenience, and that applies only to high-level languages intended to be written by people. The binary code in question is not intended for writing shaders in directly.
2) Converting code from vectored to scalar form is easy and incurs no code quality degradation. (In contrast, efficient conversion from scalar to vectored code is a very hard problem.) This means a scalar binary code would place no additional burden on compilers. Actually, it's just the other way around, because:
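To illustrate how mechanical the vectored-to-scalar direction is, here is a minimal sketch of lowering one vectored instruction on a toy IR. The instruction format, opcodes, and register names are hypothetical (not any real shader ISA): each vectored op over 4-component registers simply splits into one scalar op per component enabled in the write mask.

```python
# Toy vector-to-scalar lowering. Hypothetical IR: an instruction is
# (op, dest, src0, src1, writemask), e.g. ("add", "r0", "r1", "r2", "xyzw").
COMPONENTS = "xyzw"

def scalarize(instr):
    """Lower one vectored instruction to a list of scalar instructions."""
    op, dest, src0, src1, mask = instr
    return [
        (op, f"{dest}.{c}", f"{src0}.{c}", f"{src1}.{c}")
        for c in COMPONENTS
        if c in mask
    ]

vec_instr = ("mul", "r0", "r1", "r2", "xyz")  # .w not written
print(scalarize(vec_instr))
# → [('mul', 'r0.x', 'r1.x', 'r2.x'), ('mul', 'r0.y', 'r1.y', 'r2.y'),
#    ('mul', 'r0.z', 'r1.z', 'r2.z')]
```

The pass is purely local and loses nothing, which is the point: going this direction is trivial, while the reverse (packing independent scalars back into vector lanes) is a hard global problem.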
3) Scalar code is much easier for optimization algorithms to analyze and process. This reason makes scalar ultimately better than vectored.
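A toy example of why optimizers prefer the scalar form (again on a hypothetical IR, not a real compiler): per-component dead-code elimination. In vectored form the whole instruction must survive if any one lane of its result is read; in scalar form each lane is an independent value and dead lanes fall out of a simple backward liveness pass.

```python
# Backward dead-code elimination over scalar instructions of the form
# (op, dest, src0, src1). An instruction is kept only if its dest is
# ever read later (or is a live shader output).
def eliminate_dead(scalar_instrs, live_outputs):
    live = set(live_outputs)
    kept = []
    for op, dest, s0, s1 in reversed(scalar_instrs):
        if dest in live:
            kept.append((op, dest, s0, s1))
            live.update((s0, s1))
    return list(reversed(kept))

instrs = [
    ("mul", "r1.x", "a.x", "b.x"),
    ("mul", "r1.y", "a.y", "b.y"),
    ("mul", "r1.z", "a.z", "b.z"),
    ("mul", "r1.w", "a.w", "b.w"),   # dead: r1.w is never read below
    ("add", "o0.x", "r1.x", "c.x"),
    ("add", "o0.y", "r1.y", "c.y"),
    ("add", "o0.z", "r1.z", "c.z"),
]
print(eliminate_dead(instrs, {"o0.x", "o0.y", "o0.z"}))
# The r1.w multiply is removed; everything else is kept.
```

Had the four multiplies been one vectored `mul r1, a, b`, the optimizer would have to either keep the whole thing or track per-lane liveness through vectored defs and uses, which is exactly the extra machinery the scalar form makes unnecessary.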
I have been watching how badly Microsoft's HLSL shader compiler performs. The code it generates is awful, mainly because it has to deal with the extreme burden that is the vectored model.