
Thread: using unsigned short in glsl compute shader

  1. #1
     Intern Contributor | Join Date: Oct 2012 | Posts: 60

    using unsigned short in glsl compute shader

    Hi

    Is there a way to use unsigned shorts in a GLSL compute shader? I noticed that the supported data types do not mention unsigned short, only 32-bit types like int, float, and unsigned int.

    I have a buffer of unsigned shorts that I am reading into the compute shader, and I also write back to this buffer. If unsigned short is not supported, then I will have to either
    a) convert the unsigned shorts into floats before sending them to the compute shader,
    OR
    b) strip out the extra bits and pack 2 bytes' worth of data per element into the buffer inside the compute shader (not sure if this is the best approach; a rough sketch of this option is below).
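
    Here is a minimal sketch of option (b), assuming the unsigned-short buffer is bound to the compute shader as an array of uints with two 16-bit values packed per 32-bit word; the buffer name, binding point, and the increment operation are only illustrative:

    Code:
        #version 430
        layout(local_size_x = 64) in;

        // The unsigned-short data is bound as uints: each 32-bit word holds two 16-bit values.
        layout(std430, binding = 0) buffer PackedShorts {
            uint words[];
        };

        void main() {
            // One invocation per packed word, so no two invocations write the same uint.
            uint i = gl_GlobalInvocationID.x;
            uint word = words[i];
            uint lo = word & 0xFFFFu;           // first unsigned short
            uint hi = word >> 16;               // second unsigned short
            // ... operate on lo and hi as 16-bit values ...
            lo = (lo + 1u) & 0xFFFFu;           // example operation, masked back to 16 bits
            hi = (hi + 1u) & 0xFFFFu;
            words[i] = lo | (hi << 16);         // repack and write out
        }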

    Any suggestions based on previous experience?

    thanks

  2. #2
     Junior Member Newbie | Join Date: Jan 2013 | Posts: 11
    The GL_NV_gpu_shader5 extension added support for a full set of 8-, 16-, 32-, and 64-bit scalar and vector data types for all shader types. Note that it is an NVIDIA-specific extension, so you need to check that it is present at runtime before relying on it.
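
    A minimal sketch of what that could look like in a compute shader, assuming the driver exposes the extension and accepts a 16-bit element type directly in a shader storage block (that SSBO usage is an assumption and should be verified on your driver; otherwise fall back to packing into uints):

    Code:
        #version 430
        #extension GL_NV_gpu_shader5 : require   // NVIDIA-only; check the extension string first

        layout(local_size_x = 64) in;

        // Assumption: the driver allows a 16-bit member type in an SSBO when the
        // extension is enabled; if not, bind the data as packed uints instead.
        layout(std430, binding = 0) buffer Data {
            uint16_t values[];
        };

        void main() {
            uint i = gl_GlobalInvocationID.x;
            uint v = uint(values[i]);        // widen to 32 bits for arithmetic
            values[i] = uint16_t(v + 1u);    // narrow back to an unsigned short
        }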
