Thread: using unsigned short in glsl compute shader

  1. #1
    Intern Contributor
    Join Date
    Oct 2012

    using unsigned short in glsl compute shader


    Is there a way to use unsigned shorts in a GLSL compute shader? I noticed that the supported data types do not mention unsigned short - only 32-bit types like int, float, and unsigned int.

    I have a buffer of unsigned shorts that I am reading into the compute shader, and I also write back out to this buffer. If unsigned short is not supported, then I will have to either
    a) convert the unsigned shorts into floats before sending them to the compute shader, or
    b) pack two unsigned shorts into each 32-bit word of the buffer and mask/shift them apart inside the compute shader (not sure if this is the best approach - a sketch is below).

    Any suggestions from previous experiences?
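
    Here is a minimal sketch of option (b), assuming the unsigned shorts are packed two per 32-bit uint in an SSBO (the buffer name, binding, and workload are just illustrative):

    Code:
    #version 430
    layout(local_size_x = 64) in;

    // SSBO viewed as 32-bit words; each word holds two 16-bit values.
    layout(std430, binding = 0) buffer Buf { uint data[]; };

    void main() {
        uint i    = gl_GlobalInvocationID.x;  // index into packed pairs
        uint word = data[i];
        uint lo   = word & 0xFFFFu;           // first unsigned short
        uint hi   = word >> 16;               // second unsigned short

        // ... process lo and hi as ordinary uints ...
        lo = (lo + 1u) & 0xFFFFu;             // mask back down to 16 bits
        hi = (hi + 1u) & 0xFFFFu;

        data[i] = (hi << 16) | lo;            // repack and write out
    }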


  2. #2
    Junior Member Newbie
    Join Date
    Jan 2013
    The GL_NV_gpu_shader5 extension (NVIDIA hardware only) added support for a full set of 8-, 16-, 32-, and 64-bit scalar and vector data types, including uint16_t, in all shader stages.
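
    For example, a minimal sketch of a compute shader using the extension (assuming the driver exposes GL_NV_gpu_shader5 and that 16-bit types can be declared directly in the buffer block, which the extension permits; names are illustrative):

    Code:
    #version 430
    #extension GL_NV_gpu_shader5 : require  // NVIDIA-only extension

    layout(local_size_x = 64) in;

    // With GL_NV_gpu_shader5, the buffer elements can be declared as
    // 16-bit unsigned integers directly - no manual packing needed.
    layout(std430, binding = 0) buffer Buf { uint16_t data[]; };

    void main() {
        uint i = gl_GlobalInvocationID.x;
        data[i] = data[i] + uint16_t(1);     // plain 16-bit arithmetic
    }

    On hardware without the extension, the packing approach from the first post is the portable fallback.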
