Part of the Khronos Group
OpenGL.org

The Industry's Foundation for High Performance Graphics

from games to virtual reality, mobile phones to supercomputers


Thread: GLSL noise fail? Not necessarily!


  1. #1
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    GLSL noise fail? Not necessarily!

    I read this somewhat disheartening summary in the slides from the recent GDC presentation by Bill Licea-Kane:

    "Noise - Fail!"

    However, that does not need to be the case any longer. Recent development by Ian McEwan at Ashima Art has given us a new take on hardware friendly noise in GLSL:

    https://github.com/ashima/webgl-noise

    It might not seem like much, but his algorithm has all the hardware-friendly properties you want, some of which my old GLSL simplex noise demo was missing. In summary: it's fast, it's a simple include (no dependencies on texture data or uniform arrays), it runs in GLSL 1.20 and up (OpenGL 2.1, WebGL), and it scales well to massively parallel execution because there are no memory access bottlenecks.

    Concerning this, I would like to get in touch with some people in the Khronos GLSL workgroup. I was last involved in this around 2003-2004, and my contact list is badly outdated. Are any of the good people in the GLSL WG reading this? My email address is "stegu@itn.liu.se", if you want to keep this private. Just please respond, as I think this is great news.

    /Stefan Gustavson

  2. #2
    Junior Member Regular Contributor
    Join Date
    Mar 2009
    Posts
    152

    Re: GLSL noise fail? Not necessarily!

    Great stuff. I will try it. Thanks for the info.

  3. #3
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    For those who want an easy to run demo:

    http://www.itn.liu.se/~stegu/simplex...ise-ashima.zip

    With the default window size, the sphere covers about 70K pixels, so multiply the frame rate by 70,000 to get the number of noise samples per second. On my ATI Radeon HD 4850 I get 5700 FPS, which translates to about 400 Msamples/s. Whee!
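
    Just to sanity-check that arithmetic (Python used here purely as a calculator; it is not part of the demo):

```python
# Numbers from the post: ~70,000 covered pixels per frame at 5700 FPS.
pixels_per_frame = 70_000
fps = 5_700
samples_per_second = pixels_per_frame * fps
print(samples_per_second)  # 399000000, i.e. roughly 400 Msamples/s
```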

    Windows and Linux compatible source code (untested on Linux, but it should compile and run without changes). A Windows binary (.exe) is supplied for your convenience. The demo uses only OpenGL 2.1 and GLSL 1.20, so it should compile under MacOS X 10.5 as well, if you either run it from the command line or create an application bundle and change the file paths for the shader files to point to the right place, e.g. "../../../GLSL-ashimanoise.frag" instead of "GLSL-ashimanoise.frag". You also need the GLFW library to compile the demo yourself (see www.glfw.org).

  4. #4
    Member Regular Contributor trinitrotoluene's Avatar
    Join Date
    Sep 2008
    Location
    Montérégie, Québec
    Posts
    362

    Re: GLSL noise fail? Not necessarily!

    I tried the Windows binary on Linux through Wine and it ran great. The only "problem" is that the compiler issues a warning for the fragment shader: WARNING: 0:252: warning(#288) Divide by zero error during constant folding.

    Code :
     vec4 ip = 1.0 / vec4(pParam.w*pParam.w*pParam.w,
                          pParam.w*pParam.w,
                          pParam.w, 0.);

    Of course, changing the last component of the vec4 to any number other than 0 removes the warning and doesn't modify the noise texture on the sphere.
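
    To see why that last component matters at compile time, here is the same arithmetic mirrored in Python (the value of w is a made-up stand-in for pParam.w; only the last component is the point):

```python
w = 17.0  # hypothetical stand-in for pParam.w; the shader supplies the real value
# With 0. as the last argument, the compiler folds 1.0/0.0 -> divide by zero.
# With 1. instead, all four reciprocals are well defined:
ip = [1.0 / (w*w*w), 1.0 / (w*w), 1.0 / w, 1.0 / 1.0]
print(ip[3])  # 1.0
```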

  5. #5
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Good catch. Of course the 0. should be 1., although it does not really come into play in the calculations. I'll make sure to tell Ian and ask him to update his code as well.

  6. #6
    Junior Member Regular Contributor
    Join Date
    Mar 2009
    Posts
    152

    Re: GLSL noise fail? Not necessarily!

    When I run the GLSL-ashimanoise demo I get a 'Fragment shader compile error:' message, but the program runs OK.
    (Windows 7, GeForce GTX 470).

  7. #7
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    That is probably the bug mentioned above. I have now fixed that in the demo. I have also cleaned up the GLSL code a little.

    My own C code was also cleaned up. It's still a hack, but it's not quite as ugly anymore.

  8. #8
    Junior Member Newbie
    Join Date
    Apr 2009
    Posts
    1

    Re: GLSL noise fail? Not necessarily!

    His permutation function maps 0 to 0. Otherwise it is good.
    I do not understand the replacement for the gradient table, though... how does that work?

  9. #9
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    "His permutation function maps 0 to 0."
    That 0->0 mapping is not a problem. It is perfectly all right for a permutation to have one or even several fixed points (values that map to themselves), as long as they do not occur in too regular a pattern.

    The permutation is a permutation polynomial: permute(x) is computed as (34*x^2 + x) mod 289.
    This is one of the two neat and original ideas in Ian's implementation. (The other one is the clever generation of gradients.)
    You can read about permutation polynomials on Wikipedia.
    It is not a new idea in mathematics; it is just new to this application. A proper journal article on this noise implementation is on its way, but please have patience.
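
    The permutation claim is easy to check numerically. A quick sketch (Python used as a calculator here; the shader does this arithmetic in GLSL floats):

```python
# permute(x) = (34*x^2 + x) mod 289, as described above.
P = [(34 * x * x + x) % 289 for x in range(289)]

# It really is a permutation of 0..288: every value occurs exactly once.
assert sorted(P) == list(range(289))

# 0 maps to 0, as noted; the fixed points are exactly the multiples of 17.
assert P[0] == 0
assert [x for x in range(289) if P[x] == x] == list(range(0, 289, 17))
print("permutation verified")
```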

  10. #10
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    The github repository has now been updated with some slight speedups, code cleanups and classic Perlin noise in regular and periodic versions.

    2D simplex noise is now only about 20 mult and add operations (including five dot operations), one division, three mod, two floor and one each of step, max, fract and abs.

    I get 1.5 billion 2D noise samples per second on my relatively measly Nvidia GTX 260. An ATI HD 5870 spits out 5 billion samples per second.
