
View Full Version : Status on GLSL object code readback?



MarcusL
10-24-2007, 07:25 AM
I'm just curious if anything is planned for the app to get the _object_ code for a shader and re-use it later. The ARB_shader_objects extension mentions this as future work, but nothing has happened.

I mean, it's quite simple to do. The GL can reject the binary I got previously (new driver, different GPU, etc.), and I'll just give it the source to allow for a fresh compile.

My app's startup takes 10s to load data and 16s to compile shaders and upload textures. Most of that is shader compilation (I have a lot of them).

It's quite silly to do the same work every time I start my app.
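The fallback scheme described above amounts to a small cache layer: key the saved blob on the driver/GPU identification strings and recompile from source whenever the key no longer matches (or the GL rejects the blob anyway). A minimal sketch of that invalidation logic, assuming the strings come from something like glGetString(GL_VENDOR/GL_RENDERER/GL_VERSION) — the struct and helpers here are hypothetical, not real GL entry points:

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical cache entry: in a real app the key would be built from
 * glGetString(GL_VENDOR), glGetString(GL_RENDERER) and
 * glGetString(GL_VERSION) at the time the binary was saved. */
struct shader_cache_entry {
    char key[128];      /* driver + GPU identification at save time */
    const void *binary; /* cached object code, NULL if absent */
};

/* Build the cache key from the current driver/GPU strings. */
static void make_cache_key(char *out, size_t n,
                           const char *vendor, const char *renderer,
                           const char *version)
{
    snprintf(out, n, "%s|%s|%s", vendor, renderer, version);
}

/* Returns 1 if the cached binary may be reused, 0 if the app must
 * recompile from GLSL source (new driver, different GPU, etc.). */
static int cache_entry_usable(const struct shader_cache_entry *e,
                              const char *vendor, const char *renderer,
                              const char *version)
{
    char key[128];
    make_cache_key(key, sizeof key, vendor, renderer, version);
    return e->binary != NULL && strcmp(e->key, key) == 0;
}
```

Even with a matching key the GL would still be free to reject the blob at load time, so the source-compile path has to stay around as the fallback in any case.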

ZbuffeR
10-24-2007, 08:40 AM
There have been discussions about the possibility of downloading/uploading compiled shaders from/to the GPU. Apparently OpenGL ES does it.
I agree it seems the most sensible thing to do.
Much better, in fact, than trying to compile to a portable assembly language as some people have suggested.

knackered
10-24-2007, 03:37 PM
This is from the OpenGL ES 2.0 specification document:


2.15.2 Shader Binaries
The ShaderBinary command can be used to load precompiled shader binaries.

void ShaderBinary(int n, const uint* shaders, enum binaryformat, const void* binary, int length)

This call takes a list of n shader handles described by shaders. Each shader handle refers to a unique shader
type i.e. a vertex shader or a fragment shader. The binary argument points to the pre-compiled binary code.
This provides the ability to individually load binary vertex, or fragment shaders or load an executable binary
that contains the optimized pair of vertex and fragment shaders stored in the same binary.

Since OpenGLES provides no specific binary formats, using the generic i.e. PLATFORM_BINARY format will result
in an INVALID_ENUM error. For all other binary formats, the binary image will be decoded according to the
specification defining the binaryformat token. A binary data that does not match the specified binaryformat
will result in an INVALID_VALUE error. The bits that represent the binary is implementation specific. If
ShaderBinary failed, GetError can be used to return the appropriate error. A failed binary load does not
restore the old state of shaders for which the binary was being loaded.
Note that if shader binary interfaces are supported, then an OpenGLES implementation may require that an
optimized pair of vertex and fragment shader binaries that were compiled together be specified to LinkProgram.
Not specifying an optimized pair may result in the LinkProgram call to fail.
So there's no mention of reasons why a binary may fail; it just leaves that up to the hardware. However, I can't for the life of me find any function in OpenGL ES that allows you to retrieve a shader binary after compilation/linking - so where are we supposed to get these binaries in the first place?
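For what it's worth, the consumption side is clear enough from the spec text: hand ShaderBinary a blob, check GetError, and if the load failed recompile from GLSL source (remembering that a failed load does not restore the old shader state). A runnable sketch of that control flow, with the GL calls replaced by hypothetical `fake_*` stubs so the logic can be shown without a GL ES context:

```c
#include <stddef.h>

#define FAKE_NO_ERROR      0
#define FAKE_INVALID_VALUE 1

/* Stand-in for ShaderBinary: rejects blobs it doesn't recognise.
 * "Recognised" here just means a 4-byte magic header, for illustration;
 * a real implementation decodes per its binaryformat token. */
static int fake_shader_binary(const void *binary, int length)
{
    const unsigned char *p = binary;
    if (length < 4 || p[0] != 'B' || p[1] != 'L' || p[2] != 'O' || p[3] != 'B')
        return FAKE_INVALID_VALUE;
    return FAKE_NO_ERROR;
}

/* Stand-in for a full ShaderSource + CompileShader path. */
static int fake_compile_from_source(const char *src)
{
    return src != NULL ? FAKE_NO_ERROR : FAKE_INVALID_VALUE;
}

/* Try the precompiled binary first; fall back to the GLSL source.
 * Per the spec text above, a failed binary load does not restore the
 * old shader state, so the recompile simply replaces it. */
static int load_shader(const void *binary, int length, const char *src)
{
    if (binary != NULL && fake_shader_binary(binary, length) == FAKE_NO_ERROR)
        return FAKE_NO_ERROR;
    return fake_compile_from_source(src);
}
```

The open question in the thread still stands, of course: this only works once you have a blob from somewhere.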

Komat
10-25-2007, 12:48 AM
Originally posted by knackered:
However, I can't for the life of me find any function in OpenGL ES that allows you to retrieve a shader binary after compilation/linking - so where are we supposed to get these binaries in the first place?
I assume it is up to the vendor of a specific OpenGL ES enabled device (e.g. a phone) to create an external GLSL compiler for its device if it wants to allow use of the ShaderBinary function.

MarcusL
10-29-2007, 07:55 AM
But for "regular" OpenGL, retrieving the compiled binary would make good sense.