GLSL 1.50 specification not matching OpenGL 3.2

Hi there, I have found something like this:

I can create a VBO with whatever column types I want, e.g.
UBYTE3, but in GLSL 1.50, when I define input variables for the VS, the only integer types available are INT and UINT (while GL 3.2 also allows BYTE, UBYTE, SHORT, and USHORT).

Am I missing something, or do these two specs not fully line up with each other?

See more at:
glspec32.core.20090803.pdf - page 28
GLSLangSpec.1.50.09.pdf - page 17 (23/125)

As far as I know, your VBO types describe the in-memory storage format, while in GLSL you declare the format you want to operate with.
There is a conversion step somewhere in between (I'd need to dig through the spec more, I guess).

All attributes are automatically (and at no performance loss) converted from the in-memory representation (described by your call to gl*Pointer) to the representation used by the vertex program.

In short: don’t worry about it. If you use floats in GLSL, you’ll get floats.
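A minimal sketch of the float case, assuming a current GL 3.2 context, a bound VAO and VBO holding three GLubytes per vertex, a shader input declared as "in vec3 color;", and that its location has been queried with glGetAttribLocation (assumed 0 here):

/* The driver converts each GLubyte to float for the shader; with
 * normalized = GL_TRUE the 0..255 range is mapped to 0.0..1.0. */
GLint loc = 0; /* = glGetAttribLocation(program, "color"); */
glVertexAttribPointer(loc,
                      3,                    /* three components      */
                      GL_UNSIGNED_BYTE,     /* in-memory type        */
                      GL_TRUE,              /* normalize to [0,1]    */
                      3 * sizeof(GLubyte),  /* stride                */
                      (const void *)0);     /* offset into the VBO   */
glEnableVertexAttribArray(loc);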

So I can send a UBYTE3 column to save VRAM and it will be expanded to UINT3 in the VS on the fly?

Yes. However, you have to use glVertexAttribIPointer if the attribute is declared as an integer type in the shader.
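A sketch of the integer case, under the same assumptions as above but with the shader declaring "in uvec3 cols;" and its location assumed to be 1:

/* glVertexAttribIPointer (note the I) keeps the values integral: each
 * GLubyte is widened to a uint, with no normalization to floats. */
GLint loc = 1; /* = glGetAttribLocation(program, "cols"); */
glVertexAttribIPointer(loc,
                       3,                    /* three components    */
                       GL_UNSIGNED_BYTE,     /* in-memory type      */
                       3 * sizeof(GLubyte),  /* stride              */
                       (const void *)0);     /* offset into the VBO */
glEnableVertexAttribArray(loc);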