driver version query

According to the spec, glProgramBinary may fail if the implementation detects that the binary you supply was produced for a different driver/hardware/who_knows_what.

But the problem is that the application has no way to find out about driver versions; so far the OpenGL specification has never used or defined such a notion.

Please provide a new query, e.g. glGetString(GL_IMPLEMENTATION_ID),
to return something that uniquely identifies the current driver + hardware + whatever_else_needed set.
Then change the program binary specification to require exactly this ID to be the same,
and remove such vague things as hardware or driver versions from the specification.

This would help applications implement shader caches on top of the program binary functionality.

The version you have mentioned does not correspond to the actual driver's version; it is the version of the binary shader. You do not need to know the driver version: if the version is wrong, you simply won't be able to use that binary shader.

Your life will not be easier if you know the version of the binary shader, since you cannot change the GLSL implementation in the current driver.

Are you trying to require every possible implementation to be available in a single driver? Don't you think that is an unreasonable request?

Please read again what I have written.
From your reply it seems to me that you either did not read my suggestion carefully enough or did not understand some parts.

The version you have mentioned does not correspond to the actual driver's version.

I never said it corresponds to the actual driver version.

You do not need to know the driver version.

I never said I want to know the driver version.

Your life will not be easier if you know the version of the binary shader, since you cannot change the GLSL implementation in the current driver.

My life will be easier, because this way I can know which of my files of compressed binaries is the right one even before I load them all from disk, decompress them, and try to upload each binary to the GL.

Are you trying to require every possible implementation to be available in a single driver? Don't you think that is an unreasonable request?

What are you talking about?
As it stands, the spec says ProgramBinary may fail if the hardware or software configuration is not the same as it was when GetProgramBinary was called.
My suggestion was to change this language to say that ProgramBinary may fail if “the configuration ID” is not the same as it was when GetProgramBinary was called.
What is unclear here?

I think it is a good idea to add this query. Moreover, it might be even better to add another query to retrieve the “implementation id” of a binary shader: it makes a developer's life easier to know whether the shader is compatible before loading it up.

I'll try to talk to the AMD shader team and see what they think.

Thank you very much!

It sounds like your objective is to supply precompiled shaders along with your program's installation exe. That's not a good idea.

Or maybe just to skip loading shaders that will not work after a driver change. I.e. the application does the shader build on its first run and saves the binary images. Then the user updates drivers, and it would be nice to know whether the application should bother using those saved binary shaders. Additionally, given that some power users change between drivers often, an application could save multiple binary shaders keyed by the driver version.

Power users change between drivers often? The application should save a binary shader for each version? Who are those freaks (power users)?

most common scenario:

Application A (usually a game) works fine with driver version 1, but application B works fine with driver version 2, so tinkerers will (un?)happily swap between multiple drivers for their favorite apps (usually games). The causes range from the application being older to older drivers being out of spec, etc. Folks who jumped to “dx10” features via GL, or via GL extensions, a ways back found for a period of time that as drivers got better at conforming to the spec, their apps and shaders often needed tweaking. Often enough a driver update can be a crapshoot, right? Some things get better, some get worse…

Current features would allow saving a set of binary shaders for each run; then at each startup, try the first shader of the latest run, and if it fails, try an older set, then an older set, etc., until either a valid binary set is found or recompilation from source is needed.

I have no idea whether this is costly or not.

Well, presumably, if you have multiple sets of shaders, you would use one test shader from that set to see if it works. There’s obviously no need to load or test all of them if the test shader works.

BTW, has anyone really started using binary shaders? Does anyone keep track of which driver versions cause older binary shaders to become obsolete? It would be interesting to have such a database. Maybe IHVs don't obsolete them very often. Obviously porting between hardware wouldn't work, but if most driver updates don't obsolete them, maybe it's fine.

Current features would allow saving a set of binary shaders for each run; then at each startup, try the first shader of the latest run, and if it fails, try an older set, then an older set, etc., until either a valid binary set is found or recompilation from source is needed.

I’d say it is very likely true.