using unsigned short in glsl compute shader



driver
01-29-2014, 11:20 AM
Hi

Is there a way to use unsigned shorts in a GLSL compute shader? I noticed that the supported data types do not mention unsigned short, only 32-bit types like int, float, and unsigned int.

I have a buffer of unsigned shorts that I am reading into the compute shader, and I also write back out to this buffer. If unsigned short is not supported, then I will have to either
a) convert the unsigned shorts into floats before sending them to the compute shader,
OR
b) pack two unsigned shorts into each 32-bit word of the buffer and strip out the extra bits inside the compute shader (not sure if this is the best approach; see the sketch below).
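
Here is a minimal sketch of approach (b), assuming the buffer is bound as an SSBO of uints with two 16-bit values packed per word; the block name Data and the field name words are made up for illustration:

#version 430

layout(local_size_x = 64) in;

// The raw unsigned shorts, viewed as uints: each 32-bit word
// holds two 16-bit values (low half first, high half second).
layout(std430, binding = 0) buffer Data {
    uint words[];
};

void main() {
    uint i = gl_GlobalInvocationID.x;
    uint word = words[i];

    // Unpack the two unsigned shorts into full uints.
    uint lo = word & 0xFFFFu;
    uint hi = (word >> 16) & 0xFFFFu;

    // ... operate on lo and hi as ordinary uints ...

    // Repack, masking each value back down to 16 bits.
    words[i] = ((hi & 0xFFFFu) << 16) | (lo & 0xFFFFu);
}

Note that each invocation handles two shorts, so the dispatch would cover half the element count.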

Any suggestions based on previous experience?

thanks

AHeumann
01-30-2014, 02:37 AM
The GL_NV_gpu_shader5 (http://www.opengl.org/registry/specs/NV/gpu_shader5.txt) extension added support for a full set of 8-, 16-, 32-, and 64-bit scalar and vector data types for all shader types.
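
With that extension (which is NVIDIA-only), something along these lines should work; this is a rough sketch assuming the driver exposes GL_NV_gpu_shader5 and accepts 16-bit types in shader storage blocks, with the block and field names again made up:

#version 430
#extension GL_NV_gpu_shader5 : require

layout(local_size_x = 64) in;

// With the extension enabled, 16-bit integer types are usable directly.
layout(std430, binding = 0) buffer Data {
    uint16_t values[];
};

void main() {
    uint i = gl_GlobalInvocationID.x;
    values[i] = values[i] + uint16_t(1);
}

If you need to run on other vendors' hardware, the packing approach from the earlier post remains the portable fallback.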