Thread: GLSL noise fail? Not necessarily!

  1. #1
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    GLSL noise fail? Not necessarily!

    I read this somewhat disheartening summary in the slides from the recent GDC presentation by Bill Licea-Kane:

    "Noise - Fail!"

    However, that does not need to be the case any longer. Recent development by Ian McEwan at Ashima Art has given us a new take on hardware-friendly noise in GLSL:

    https://github.com/ashima/webgl-noise

    It might not seem like much, but his algorithm has all the hardware-friendly properties you want, some of which my old GLSL simplex noise demo was missing. In summary, it's fast, it's a simple include (no dependencies on texture data or uniform arrays), it runs in GLSL 1.20 and up (OpenGL 2.1, WebGL), and it scales well to massively parallel execution because there are no memory-access bottlenecks.
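
    To give an idea of what "a simple include" means, here is a rough, untested sketch of how a fragment shader might call it. It assumes the snoise(vec3) function from the noise3D source in that repository is pasted above the shader (or attached as a second fragment shader object); the varying name vPosition is just an example and not part of Ian's code:

    Code :
     // Sketch only: snoise(vec3) is assumed to come from the webgl-noise sources.
     varying vec3 vPosition;   // example varying from the vertex shader, not part of the library
     float snoise(vec3 v);     // defined in the pasted/attached noise source

     void main() {
         float n = snoise(4.0 * vPosition);               // the frequency factor sets the feature size
         gl_FragColor = vec4(vec3(0.5 + 0.5 * n), 1.0);   // remap [-1,1] to [0,1] for display
     }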

    Concerning this, I would like to get in touch with some people in the Khronos GLSL workgroup. I was last involved in this around 2003-2004, and my contact list is badly outdated. Are any of the good people in the GLSL WG reading this? My email address is "stegu@itn.liu.se", if you want to keep this private. Just please respond, as I think this is great news.

    /Stefan Gustavson

  2. #2
    Junior Member Regular Contributor
    Join Date
    Mar 2009
    Posts
    152

    Re: GLSL noise fail? Not necessarily!

    Great stuff. I will try it. Thanks for the info.

  3. #3
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    For those who want an easy-to-run demo:

    http://www.itn.liu.se/~stegu/simplex...ise-ashima.zip

    With the default window size, the sphere covers about 70K
    pixels, so multiply the frame rate by 70,000 to get the
    number of noise samples per second.
    On my ATI Radeon HD 4850, I get 5700 FPS, which translates
    to about 400 Msamples/s. Whee!

    Windows- and Linux-compatible source code. (Untested on
    Linux, but it should compile and run without changes.)
    A Windows binary (.exe) is supplied for your convenience.
    It uses only OpenGL 2.1 and GLSL 1.20, so it should compile
    under Mac OS X 10.5 as well, provided you either run it from
    the command line or create an application bundle and change
    the file paths for the shader files to point to the right
    place, e.g. "../../../GLSL-ashimanoise.frag" instead of
    "GLSL-ashimanoise.frag". You also need the GLFW library
    to compile the demo yourself (see www.glfw.org).

  4. #4
    Member Regular Contributor trinitrotoluene
    Join Date
    Sep 2008
    Location
    Montérégie,Québec
    Posts
    362

    Re: GLSL noise fail? Not necessarily!

    I tried the Windows binary on Linux through Wine and it ran great. The only "problem" is that the compiler issues a warning for the fragment shader: WARNING: 0:252: warning(#288) Divide by zero error during constant folding.

    Code :
     vec4 ip = 1.0 / vec4(pParam.w*pParam.w*pParam.w, 
                           pParam.w*pParam.w, 
                           pParam.w,0.);

    Of course, changing the last parameter of the vec4 to any number other than 0 removes the warning and does not change the noise texture on the sphere.

  5. #5
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Good catch. Of course the 0. should be 1., although it does not really come into play in the calculations. I'll make sure to tell Ian and ask him to update his code as well.
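
    For reference, the corrected constant looks like this (only the last component changes):

    Code :
     vec4 ip = 1.0 / vec4(pParam.w*pParam.w*pParam.w,
                          pParam.w*pParam.w,
                          pParam.w, 1.0);   // 1.0 instead of 0., so no divide-by-zero warning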

  6. #6
    Junior Member Regular Contributor
    Join Date
    Mar 2009
    Posts
    152

    Re: GLSL noise fail? Not necessarily!

    When I run the GLSL-ashimanoise demo I get a 'Fragment shader compile error:' message, but the program runs OK.
    (Windows 7, GeForce GTX 470).

  7. #7
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    That is probably the bug mentioned above. I have now fixed that in the demo. I have also cleaned up the GLSL code a little.

    My own C code was also cleaned up. It's still a hack, but it's not quite as ugly anymore.

  8. #8
    Member Regular Contributor remdul
    Join Date
    Mar 2004
    Location
    The Netherlands
    Posts
    334

    Re: GLSL noise fail? Not necessarily!

    Nice work.

    "Noise - Fail!"
    So the "return 0.0" wasn't due to IP issues but plain old laziness? Interesting twist.

  9. #9
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Because of my long-standing interest in noise, I have had some insight into the painful and drawn-out process of implementing noise() in GLSL. I would venture a guess that the problems have not been primarily due to licensing or patent issues, but rather to the lack of a good enough candidate, and a resulting fear of premature standardization.

    A noise() function that gets implemented as part of GLSL needs to be very hardware friendly. Previous attempts have been lacking in at least some respects. Ian's code removes two memory accesses in a very clever way, by introducing a permutation polynomial and creating an elegant mapping from an integer to a 2D, 3D or 4D gradient. This is the first time I have seen a clear candidate for a standard noise() function that both runs well as stand-alone shader code *and* scales well to a massively parallel hardware implementation.
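
    For the curious, the permutation polynomial trick is roughly of this flavour (paraphrased from memory of the webgl-noise sources, so the exact constants may differ):

    Code :
     // Hash with arithmetic only, instead of reading a perm[] table from a texture or uniform array.
     vec4 mod289(vec4 x)  { return x - floor(x * (1.0 / 289.0)) * 289.0; }
     vec4 permute(vec4 x) { return mod289(((x * 34.0) + 1.0) * x); }   // replaces a table lookup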

    Also, a standard noise() implementation will need to remain reasonably stable over time. You can't expect people to create real time shaders using one version of noise() only to have them look different when a slightly better but different version shows up in the next generation of hardware.

    In short, a standard noise() needs to be hardware *and* software friendly, to enable an efficient implementation in silicon but also to allow for a shader fallback with good performance. A standard also needs to be good enough to keep around for a long time. This code delivers on both counts, I think.

  10. #10
    Super Moderator OpenGL Lord
    Join Date
    Dec 2003
    Location
    Grenoble - France
    Posts
    5,580

    Re: GLSL noise fail? Not necessarily!

    Very interesting.

    "one version of noise() only to have them look different when a slightly better but different version shows up"

    To me this is the biggest problem.
    Indeed, I saw some problems in RenderMan-compliant pipelines because noise() was implemented differently.

    Having a custom noise written in GLSL code makes your shader deterministic and more portable.
