
View Full Version : Shader and program binary loading



Victor Zamanian
12-12-2011, 02:11 AM
Hello!

As far as I understand, one can write compiled and linked binary objects to a file so that they may be cached and loaded at a later time. This can be done both for separate, compiled shader objects and complete, linked program objects. The OpenGL extension for this feature is GL_ARB_get_program_binary. (Please correct me on any and all details here.)

My issue concerns querying whether my OpenGL context/GPU supports these features. According to the OpenGL specification Version 4.1 (Core) (I am unable to create a context higher than this at the moment because of a bug in the Nvidia drivers), page 53, section 2.11 "Vertex Shaders" says:

"A GL implementation must support shader compilation (the boolean value [GL_]SHADER_COMPILER must be [GL_]TRUE). If the integer value of [GL_]NUM_SHADER_BINARY_FORMATS is greater than zero, then shader binary loading is supported."

Some Googling led me to believe that the integer value of GL_NUM_PROGRAM_BINARY_FORMATS (note: "PROGRAM" instead of "SHADER") is the query token for checking whether program binary loading is supported.

I wrote some code to query these values after context (4.1) creation and after GLEW initialization.



GLboolean shader_compiler;
glGetBooleanv(GL_SHADER_COMPILER, &shader_compiler);
printf("GL_SHADER_COMPILER: %d\n", shader_compiler);
if (shader_compiler == GL_TRUE) {
    puts("shader compilation supported");
} else if (shader_compiler == GL_FALSE) {
    puts("shader compilation not supported");
}

GLint program_formats = 0;
glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &program_formats);
printf("GL_NUM_PROGRAM_BINARY_FORMATS: %d\n", program_formats);
if (program_formats > 0) {
    puts("program binary loading supported");
} else {
    puts("program binary loading not supported");
}

GLint shader_formats = 0;
glGetIntegerv(GL_NUM_SHADER_BINARY_FORMATS, &shader_formats);
printf("GL_NUM_SHADER_BINARY_FORMATS: %d\n", shader_formats);
if (shader_formats > 0) {
    puts("shader binary loading supported");
} else {
    puts("shader binary loading not supported");
}


Output for the code above is:



GL_SHADER_COMPILER: 1
shader compilation supported
GL_NUM_PROGRAM_BINARY_FORMATS: 1
program binary loading supported
GL_NUM_SHADER_BINARY_FORMATS: 0
shader binary loading not supported


How come shader binary loading is not supported while program binary loading is? I've been Googling this for about three days now and I am none the wiser. Is there something I need to do to enable it?
I run Ubuntu 11.10. The GPU is an Nvidia GeForce GTX 580 with driver version 290.10. I use GLFW (version 2.7.2) to open a window with a context, and GLEW version 1.7.0.
Thank you for your time!

Ffelagund
01-06-2012, 05:39 PM
Hi,

I'm a little confused on this topic too. Reading the specs, I found a way to RETRIEVE and LOAD PROGRAM binaries, but I only found a way to LOAD SHADER binaries. Is the way of retrieving shader binaries expected to be provided by an extension? It seems weird to me that the specs don't make a single mention of how I can get those binaries, yet they do talk about how to load them.

PS: my intention is not to cannibalize your post. I have the same questions as you, and I think it's better to write my thoughts here than to create a new post.

Aleksandar
01-09-2012, 11:09 AM
How come shader binary loading is not supported while program binary loading is?
It's quite normal, since binary shaders are part of a completely different extension: ARB_ES2_compatibility. ;)


I'm a little confused on this topic too. Reading the specs I'd found a way to RETRIEVE/LOAD PROGRAM binaries but I'd only found a way to LOAD SHADER binaries. It's expected that the way of retrieving shader's binaries would be provides by an extension? It seems weird to me that the specs dont make a single mention of how can I get those binaries, but they speak about how to load them.

Well, yes, the specification is not the document one should start with. :)

The section regarding binary shaders is very confusing. Of course there is nothing about creating binary shaders, since that is the task of an external shader compiler. Binary shaders are part of OpenGL ES 2.0. They were included in GL4+ in order to ease porting applications from ES2 to desktop GL.


Is there something I need to do to enable it?
AFAIK, there is no such switch. Why do you need it? Even program binaries are not convenient to support across different platforms/drivers.
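For what it's worth, the ARB_get_program_binary path for caching a linked *program* looks roughly like the sketch below. This is untested pseudocode-level C: it assumes a current GL 4.1 context, a successfully linked program object, that GL_PROGRAM_BINARY_RETRIEVABLE_HINT was set via glProgramParameteri before linking, and an arbitrary cache file path. The helper names are made up for illustration.

```c
/* Sketch: caching a linked program binary to disk and reloading it.
   Needs a current GL 4.1+ context; not runnable stand-alone. */
#include <stdio.h>
#include <stdlib.h>

/* Save: ask GL for the binary, then write the format token and
   the blob to a file. */
void save_program_binary(GLuint program, const char *path)
{
    GLint length = 0;
    glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);

    GLenum format = 0;
    void *binary = malloc(length);
    glGetProgramBinary(program, length, NULL, &format, binary);

    FILE *f = fopen(path, "wb");
    fwrite(&format, sizeof(format), 1, f);
    fwrite(binary, 1, length, f);
    fclose(f);
    free(binary);
}

/* Load: read the format token and blob back, hand them to
   glProgramBinary, and check GL_LINK_STATUS. A driver update can
   invalidate a cached binary, in which case you must fall back to
   recompiling from source. */
GLboolean load_program_binary(GLuint program, const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f) return GL_FALSE;

    GLenum format = 0;
    fread(&format, sizeof(format), 1, f);
    fseek(f, 0, SEEK_END);
    long size = ftell(f) - (long)sizeof(format);
    fseek(f, (long)sizeof(format), SEEK_SET);

    void *binary = malloc(size);
    fread(binary, 1, size, f);
    fclose(f);

    glProgramBinary(program, format, binary, (GLsizei)size);
    free(binary);

    GLint status = GL_FALSE;
    glGetProgramiv(program, GL_LINK_STATUS, &status);
    return (GLboolean)status;
}
```

The point Aleksandar makes applies here: the blob is opaque and format/driver specific, so treat it strictly as a cache, never as a distribution format.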

Ffelagund
01-09-2012, 12:04 PM
Well, the specs are the first doc I start with when I try to learn a new feature (especially when it's a really new feature and there are no online tutorials yet); then I search for tutorials or examples which can clarify some things for me.

But you are right, sometimes the specs aren't the best place to start, especially with some obscure or complex features.

In this case everything pointed to the idea that if I can load shader binaries and load/unload the online compiler, I should be able to produce shader binaries (just as I can with program binaries), but in the end I can't, and there is no info about how I could.

Well, for me program binaries are enough, but this is one of those OpenGL things where I don't agree with how it was thought out, or at least explained.

Aleksandar
01-09-2012, 12:45 PM
In this case everything pointed to the idea that if I can load shader binaries and load/unload the online compiler, I should be able to produce shader binaries (just as I can with program binaries), but in the end I can't, and there is no info about how I could.

Take a look at some GL ES 2.0 book or tutorial. ES deals with binary shaders quite frequently.
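For completeness, the ES2-style loading path (the only one desktop GL exposes for shader binaries) looks roughly like this. The binary blob and its format token must come from a vendor's offline shader compiler; `vendor_format`, `vendor_blob`, and `vendor_blob_size` below are hypothetical placeholders, not real API names.

```c
/* Sketch of glShaderBinary usage (ARB_ES2_compatibility path).
   Not runnable as-is: the blob and format come from a vendor's
   offline compiler, and the valid format tokens should first be
   queried with glGetIntegerv(GL_SHADER_BINARY_FORMATS, ...). */
GLuint shaders[2];
shaders[0] = glCreateShader(GL_VERTEX_SHADER);
shaders[1] = glCreateShader(GL_FRAGMENT_SHADER);

/* Load precompiled binaries into both shader objects at once. */
glShaderBinary(2, shaders, vendor_format, vendor_blob, vendor_blob_size);

/* No glCompileShader call is needed afterwards; attach the shaders
   to a program and link as usual. */
```

This explains the original poster's output: GL_NUM_SHADER_BINARY_FORMATS is 0 on desktop Nvidia drivers because they ship no offline shader compiler, while GL_NUM_PROGRAM_BINARY_FORMATS is 1 because program binaries are produced by the driver itself.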

Also, I would recommend using the extension registry for learning new features, rather than the specification itself.