Thread: Fragment shader for unsigned integer textures

  1. #1
     Junior Member, Regular Contributor (Join Date: Jun 2012; Posts: 190)

    Fragment shader for unsigned integer textures

    I am using the following shader to read data from unsigned integer textures:
    Fragment shader:

    Code :
    out uvec4 fragColor;
    uniform uint factor;
    void main()
    {
        uint temp = factor;
        temp = temp / 2;
        fragColor = uvec4(temp, temp, temp, temp);
    }

    But I am getting a compile error on driver A:
    "Compile failed.
    ERROR: 0:7: '/' : Wrong operand types. No operation '/' exists that takes a left-hand operand of type 'uint' and a right operand of type 'const int' (and there is no acceptable conversion)
    ERROR: 1 compilation errors. No code generated."

    On AMD it runs perfectly. Is driver A buggy?

  2. #2
    Junior Member, Newbie (Join Date: Jan 2011; Posts: 9)
    Instead of
    Code :
    temp = temp / 2;
    use
    Code :
    temp = temp / 2u;
    and always keep an eye on signed/unsigned operands: paragraph 4.1.10 "Implicit conversions" of the GLSL spec says:
    "There are no implicit conversions between signed and unsigned integers."
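    Putting it together, the full fragment shader would look roughly like this (a sketch only; the #version directive is my assumption, since the original post does not show one):
    Code :
    #version 330 core              // assumed version; any pre-4.0 version behaves the same here
    out uvec4 fragColor;           // unsigned integer output, e.g. for a GL_RGBA32UI attachment
    uniform uint factor;
    void main()
    {
        uint temp = factor;
        temp = temp / 2u;          // unsigned literal, so both operands of '/' are uint
        fragColor = uvec4(temp, temp, temp, temp);
    }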

  3. #3
    malexander, Member, Regular Contributor (Join Date: Aug 2009; Location: Ontario; Posts: 302)
    Quote:
    "and always keep your eye on signed/unsigned operands: paragraph 4.1.10 'Implicit conversions' of GLSL spec says: 'There are no implicit conversions between signed and unsigned integers'"
    This is true of the GLSL specs earlier than 4.0. GLSL 4.0 added an implicit conversion from signed to unsigned. AMD seems to use the same compiler for all shader versions when it comes to syntax and conversions, so you get these more advanced features whether you want them or not (another example is a swizzle on a float). Nvidia's GLSL compiler is more strict in this regard, so unless you specify #version 400, implicit int->uint conversions are not supported.
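    As a rough illustration (the #version 400 directive and the rest of this shader are my own sketch, not taken from the original post), the following compiles even on a strict compiler, because GLSL 4.00 implicitly converts the int literal 2 to uint:
    Code :
    #version 400 core
    out uvec4 fragColor;
    uniform uint factor;
    void main()
    {
        // GLSL 4.00 adds an implicit int -> uint conversion, so the plain
        // literal 2 is accepted here; pre-4.0 compilers reject this line.
        uint temp = factor / 2;
        fragColor = uvec4(temp);   // single-scalar constructor fills all four components
    }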

  4. #4
    Junior Member, Regular Contributor (Join Date: Jun 2012; Posts: 190)
    Thanks, it worked.
