Problem with "layout" Syntax

Hello,

I am currently trying to use the layout syntax in combination with "in" and "uniform" variables in my vertex shader. Here is my vertex shader:

#version 140

layout(location = 0) in vec4 pos;
layout(location = 0) uniform vec2 offset;

void main() {
	vec4 p = pos;   // 'in' variables are read-only, so work on a copy
	p.x += offset.x;
	p.y += offset.y;
	gl_Position = p;
}

When I try to compile this I get a compiler error, and printing the log gives me the following message:
ERROR: 0:3: ‘location’ : syntax error syntax error

Can someone tell me what the problem with my shader is? My system supports (according to glGetString(GL_VERSION)) version 3.3.0 - Build 8.15.10.2696.
Thank you,
GreenOwl

GL_ARB_explicit_attrib_location and GL_ARB_explicit_uniform_location are core in GL 3.3 and GL 4.3 respectively; on earlier versions they are only available if the implementation exposes the extension, which is at least the case on my current machine using a GTX 285 (GL 3.3) and the latest driver.
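As an aside, whether an extension is exposed can be checked at runtime by scanning the string returned by glGetString(GL_EXTENSIONS) (on core profiles you would iterate glGetStringi instead). A minimal sketch of the matching logic, using exact token matching so a name that is a prefix of another extension does not produce a false positive — the extension lists used here are just example data:

```c
#include <assert.h>
#include <string.h>

/* Returns 1 if 'name' appears as a full space-separated token in
 * 'ext_list', 0 otherwise. A plain strstr() is not enough: it would
 * also match names that are merely prefixes of longer extension names. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == ext_list || p[-1] == ' ');
        int ends   = (p[len] == '\0' || p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;   /* keep searching past this partial match */
    }
    return 0;
}
```

In a real application you would pass the string obtained from the GL implementation instead of a literal.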

Anyway, if you want to use the new functionality introduced by these extensions and still want to use GLSL 1.40 conforming shaders, you need to explicitly enable (or, better, require) the extensions in your shader:


#version 140
#extension GL_ARB_explicit_attrib_location : require
#extension GL_ARB_explicit_uniform_location : require

// your code here

This should do the trick. If you can target GLSL 3.30 instead, you can drop one extension directive:


#version 330
#extension GL_ARB_explicit_uniform_location : require
// your code here

Thank you for your answer!

I tried your second solution but it didn't work. Here is the error message:

ERROR: 0:2: '' :  extension 'GL_ARB_explicit_uniform_location' is not supported
ERROR: 0:5: 'uniform' : syntax error syntax error

Is OpenGL version 4.3 required to use

layout(location = value)

without any extensions? And why do I get the syntax error on the uniform declaration?
Thank you,
GreenOwl

As I said, if your implementation (i.e. the GL implementation that comes with your driver) does not expose GL_ARB_explicit_uniform_location, you're out of luck. What GPU do you use? Judging from the compiler errors, this looks an awful lot like an Intel GPU. Sandy Bridge or Ivy Bridge?

EDIT: I forgot your actual question. Yes, a GLSL 4.30 compiler is required to handle explicit uniform locations without enabling the corresponding extension. You get a compile-time error because the extension is not supported at all, neither as core functionality nor as a separate extension, and thus cannot be explicitly enabled the way you tried above.

Yes, I have an Intel HD4000 (integrated?) GPU in my system.

Is OpenGL version 4.3 required to use layout(location = value) without any extensions?

I am still not completely sure whether this is the case or not. On my home PC with an ATI HD 5xxx GPU the shader works without any declaration of extensions.
Is it necessary to write two implementations of this shader (one with the explicit locations and one without) and choose on the client side which one to use, to ensure that it works on both of my systems?

Thank you,
GreenOwl

Some implementations aren't that strict about whether they allow you to use a feature without declaring the extension. However, if you want your application to run on all conforming implementations, you have to do one of the following:

  1. Use GLSL 4.30 shaders, which have both features in core, but are only supported if your driver supports OpenGL 4.3:
#version 430

layout(location = 0) in vec4 pos;
layout(location = 0) uniform vec2 offset;
  2. Use GLSL 3.30 shaders, which have explicit attribute locations in core (but not explicit uniform locations); this requires OpenGL 3.3:
#version 330
#extension GL_ARB_explicit_uniform_location : require

layout(location = 0) in vec4 pos;
layout(location = 0) uniform vec2 offset;
  3. Use an older shading language version, but then you need both extension enables:
#version 140
#extension GL_ARB_explicit_attrib_location : require
#extension GL_ARB_explicit_uniform_location : require

layout(location = 0) in vec4 pos;
layout(location = 0) uniform vec2 offset;

Unfortunately, for now you’ll have to stick with using non-explicit locations for uniforms until all vendors support them.

Also, please don't confuse things. There is no single generic "layout(location)" feature; there are several. One applies only to vertex inputs and fragment shader outputs, one applies to varyings, and one applies to uniforms. These were introduced in separate versions of OpenGL and GLSL, so having one of them working doesn't mean they all will work.
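To make the choice between the three options above concrete, the client-side selection could be sketched like this. This is a sketch under assumptions: the version numbers and extension flags would come from glGetIntegerv(GL_MAJOR_VERSION / GL_MINOR_VERSION) and an extension query, and the function name pick_preamble is made up for illustration:

```c
#include <assert.h>
#include <string.h>

/* Picks a GLSL preamble for explicit attrib + uniform locations based on
 * the GL version and which extensions the implementation exposes.
 * Returns NULL if explicit uniform locations are not usable at all. */
static const char *pick_preamble(int major, int minor,
                                 int has_attrib_loc_ext,
                                 int has_uniform_loc_ext)
{
    int version = major * 100 + minor * 10;   /* e.g. 4.3 -> 430 */

    if (version >= 430)                        /* option 1: all in core */
        return "#version 430\n";

    if (version >= 330 && has_uniform_loc_ext) /* option 2 */
        return "#version 330\n"
               "#extension GL_ARB_explicit_uniform_location : require\n";

    if (has_attrib_loc_ext && has_uniform_loc_ext) /* option 3 */
        return "#version 140\n"
               "#extension GL_ARB_explicit_attrib_location : require\n"
               "#extension GL_ARB_explicit_uniform_location : require\n";

    /* Fall back to glBindAttribLocation / glGetUniformLocation. */
    return NULL;
}
```

The returned preamble would be prepended to the shader body before handing it to glShaderSource; the NULL case is the variant without any explicit locations.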

Thank you for your reply.

I am not completely sure if I am interpreting your post the right way:
Do you suggest using non-explicit locations (via glBindAttribLocation) until, at some point in the future, the location syntax (for in and uniform) is standard everywhere? Or should I implement shaders (1) - (3) and choose on the client side which one to use? But then I would also have to implement another shader (4) using none of the location syntax (for in and uniform) for my old system?

Thank you,
GreenOwl

PS.: Please excuse my bad spelling and grammar. I am not a native speaker.

First of all, the HD4000 GPU is integrated into an Ivy Bridge CPU, which supports OpenGL 4.0 on Windows, so you should definitely update your drivers. Since I don't have an Ivy Bridge at hand anywhere, you can quickly check whether the extension is supported using GPU Caps Viewer with the latest driver.

I am still not completly sure whether this is the case or not. On my home PC with a ATI HD 5xxx GPU the shader works without any declaration of extensions.

Well, as far as I read the GLSL spec, it should spit out a compile-time error. The spec states:

and furthermore

First of all, the former statement is simply horribly formulated, and it seems to directly contradict the GL spec, which states:

If this is the case, then the GLSL compiler of the implementation is also required to handle any other version down to 1.40.

Now, the way I read both statements, what's important is that if you specify a lower version, the language accepted has to match that particular version. GLSL 1.40 definitely does not handle explicit uniform locations and attribute locations without enabling the extensions. So I suspect that successfully compiling a shader which uses a feature that is not part of the language version specified in the shader is simply incorrect and an implementation bug. A compile-time error should be thrown, IMHO.

I'd like to hear from the others on this, though. I have been wrong before. :)