ProgramBinary on Intel HD 4000

Hi!

I am trying to use GL_ARB_get_program_binary on my laptop's Intel HD 4000, running Linux Mint 18.2 (based on Ubuntu 16.04).

After installing the latest Mesa I see the following glxinfo output (it reports OpenGL 4.2 core support):


    Vendor: Intel Open Source Technology Center (0x8086)
    Device: Mesa DRI Intel(R) Ivybridge Mobile  (0x166)
    Version: 17.4.0
    Accelerated: yes
    Video memory: 1536MB
    Unified memory: yes
    Preferred profile: core (0x1)
    Max core profile version: 4.2
    Max compat profile version: 3.0
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.0

But glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, …) always returns zero.
glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &binary_length) is also always zero.
I tried calling glProgramParameteri(id_, GL_PROGRAM_BINARY_RETRIEVABLE_HINT, GL_TRUE); after creating the program, but still no luck.
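
Roughly what I am doing is this (a trimmed sketch of my code, assuming a current 4.2 core context with function pointers already loaded; program is the shader program object, the same one I call id_ above):

    GLint num_formats = 0;
    glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &num_formats);   // always 0 here

    glProgramParameteri(program, GL_PROGRAM_BINARY_RETRIEVABLE_HINT, GL_TRUE);
    glLinkProgram(program);                                       // hint is set before the link

    GLint binary_length = 0;
    glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &binary_length);  // also always 0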

I also tried running gl-410-program-binary from ogl-samples-pack; same result, GL_NUM_PROGRAM_BINARY_FORMATS is zero.

So, is this behavior expected on the Intel HD 4000? Is it a well-known issue, or am I doing something wrong?

From the GL_ARB_get_program_binary extension specification.

In the section entitled “New State”:

    (table 6.51, Implementation Dependent Values) add the following:

    Get Value                       Type    Get Command          Minimum Value  Description                        Section
    -------------                   ----    -----------          -------------  -----------                        -------
    PROGRAM_BINARY_FORMATS          0* x Z  GetIntegerv          N/A            Enumerated program binary formats  2.14.3
    NUM_PROGRAM_BINARY_FORMATS      Z       GetIntegerv          0              Number of program binary formats   2.14.3

Zero is therefore a legal value for GL_NUM_PROGRAM_BINARY_FORMATS.

It is a legal value, but it being zero is what confuses me: the driver reports OpenGL 4.2, so I expected program binaries to work. Is this normal? I mean, is this well-known behavior for the Intel HD 4000?

[QUOTE=mhagain;1289489]Zero is therefore a legal value for GL_NUM_PROGRAM_BINARY_FORMATS.[/QUOTE]

It’s not known behaviour for that particular hardware so far as I’m aware, but it is allowed behaviour for any hardware or driver.

OpenGL has several weird behaviours like this (another example is occlusion queries, where the number of query counter bits is allowed to be 0) and responsibility lies with you, as the programmer, to be aware of them and adapt your code to them.

Just to be clear: the behaviour you are observing with the Intel is perfectly legal and conformant behaviour. And the Intel might not be the only GPU it happens on, so if you ever release your program you will have to be aware of the fact that supporting 0 binary formats is legal behaviour for any GPU, and make decisions around that.

So in summary, no: you haven’t discovered a driver bug and you haven’t discovered behaviour specific to the Intel 4000. What you have discovered is behaviour allowed by the GL spec that you weren’t previously aware of, and that you need to correct your own code to allow for it.
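
In practice that just means guarding the binary-cache path. Something along these lines, purely as a rough sketch (not tested; assumes a successfully linked program object, a GL loader, and <vector>):

    GLint num_formats = 0;
    glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &num_formats);

    if (num_formats > 0)
    {
        // The driver exposes at least one binary format, so the cache path is usable.
        GLint length = 0;
        glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);

        std::vector<unsigned char> binary(length);
        GLenum format = 0;
        glGetProgramBinary(program, length, NULL, &format, binary.data());
        // ... store format + binary somewhere, reload later with glProgramBinary ...
    }
    else
    {
        // No binary formats: always compile and link from GLSL source instead.
    }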

[QUOTE=mhagain;1289493]So in summary, no: you haven’t discovered a driver bug and you haven’t discovered behaviour specific to the Intel 4000. What you have discovered is behaviour allowed by the GL spec that you weren’t previously aware of, and that you need to correct your own code to allow for it.[/QUOTE]

Ok, got it. Thx!

[QUOTE=Alexey_Romanov_AR5;1289488]GL_ARB_get_program_binary on my laptop…
Intel HD 4000. Linux Mint 18.2 (Ubuntu 16.04)…
Mesa DRI…

But glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, …) always return zero. [/QUOTE]

Interesting. This is a function of the GL drivers, not the hardware.

If you mine this table here:

[ul]
[li]GL_NUM_PROGRAM_BINARY_FORMATS report (opengl.gpuinfo.org)[/li]
[/ul]
You’ll find that:

[ul]
[li]all of the genuine Intel drivers for Intel GPUs report GL_NUM_PROGRAM_BINARY_FORMATS == 1, but[/li]
[li]all of the Mesa3D drivers for Intel GPUs report GL_NUM_PROGRAM_BINARY_FORMATS == 0.[/li]
[/ul]
So this num formats == 0 thing seems to be a Mesa3D quirk.

If you want to use this GL extension, you probably want to buy a GPU with GL drivers that support it under Linux. See that table for many examples.
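
To compare your own driver against the entries in that table, a quick dump like this (just a sketch; assumes a current GL context plus <cstdio> and <vector>) shows exactly what it exposes:

    GLint count = 0;
    glGetIntegerv(GL_NUM_PROGRAM_BINARY_FORMATS, &count);
    printf("GL_NUM_PROGRAM_BINARY_FORMATS = %d\n", count);

    if (count > 0)
    {
        std::vector<GLint> formats(count);
        glGetIntegerv(GL_PROGRAM_BINARY_FORMATS, formats.data());
        for (GLint i = 0; i < count; ++i)
            printf("  binary format %d = 0x%04X\n", i, formats[i]);
    }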