View Full Version : Shader compilation error : 'unsigned' reserved word

06-18-2014, 02:41 AM

I wrote a shader in version 110.
I changed a uniform variable from int to unsigned int.
I had no problem on my computer with nvidia card.
But on another computer with an ATI X1300/1550 I get the compilation error: 'unsigned' reserved word.

I saw that unsigned integer support was added in GLSL version 130.
So I set #version 130 in my shader.
But this changes nothing...

It is funny that things that are not in the spec work on NVIDIA.
But it is not funny that things that should work do not work on ATI.
With glewinfo I saw that glUniform1ui is available on this ATI card with OpenGL 3.0 (i.e. GLSL 130).

Why does it not work on ATI?
Is my "#version 130" not enough?

I can use an int for now, but I would like to understand...


Agent D
06-18-2014, 04:08 AM
See here (http://www.opengl.org/wiki/Data_Type_%28GLSL%29) for GLSL data types. It is called "uint", not "unsigned int".
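For reference, a minimal sketch of what the declaration should look like under GLSL 1.30 (assuming a simple fragment shader; the uniform name is hypothetical):

```glsl
#version 130

// GLSL uses "uint", not C's "unsigned int".
// "unsigned" is a reserved word in GLSL and will not compile.
uniform uint sampleCount;

out vec4 fragColor;

void main()
{
    // Unsigned literals take a "u" suffix; convert explicitly to float.
    float shade = float(sampleCount) / 16u == 0u ? 0.0 : float(sampleCount) / 16.0;
    fragColor = vec4(vec3(shade), 1.0);
}
```

On the application side, such a uniform would be set with glUniform1ui (core since OpenGL 3.0).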

06-18-2014, 04:55 AM
The general rule for desktop OpenGL is:

Things work on NVIDIA that shouldn't.
Things don't work on AMD that should.
Intel is a jungle.

You really need to test on all 3 vendors and you can assume nothing. In many cases it comes down to a choice between writing 3 separate vendor-specific code paths or finding the minimum that works consistently on all 3. What a mess.

06-18-2014, 05:18 AM
I used uint; "unsigned" is just the word appearing in the error message.

So it's bad news if this is not a beginner mistake with #version or something else... :(

Thanks for your quick answers.