Page 4 of 7 (Results 31 to 40 of 69)

Thread: GLSL noise fail? Not necessarily!

  1. #31
    Junior Member Newbie
    Join Date
    Mar 2011
    Posts
    5

    Re: GLSL noise fail? Not necessarily!

    Quote Originally Posted by Alfonse Reinheart
    (though they'll have to work out how compatible it is with the Artistic License 2.0)
    Quote Originally Posted by PkK
    If this was under a more permissive license
    The purpose of the license we chose was to allow anyone to use and modify the code while, at least initially, keeping contributions and bug fixes reasonably coherent.

    Also, as with all startups, we have to walk a fine line between the desire to selflessly distribute 'cool code' and the interests of our investors.

    That said, we can always create custom licenses, for specific applications. Drop me an email.

  2. #32
    Junior Member Newbie
    Join Date
    Mar 2011
    Posts
    1

    Re: GLSL noise fail? Not necessarily!

    It's nice to see that you've noticed that Perlin, in his "improved noise" paper, forgot to normalize his gradients. I have seen the error migrate into various implementations meant to replace the original algorithm.

    And if I'm reading your shader correctly, you calculate your gradients with the same algorithm. If so, then there is an optimization you can use.

    Because of the way the gradients are calculated, they all have a uniform magnitude (sqrt(1) = 1 for 2D noise, sqrt(2) for 3D noise, sqrt(3) for 4D noise).

    In your implementation, you normalize each gradient before calculating the final weighted sum.

    Because the output of a noise function is a sum of the weighted gradients, and the gradient magnitudes are constant, you can drop the per-gradient normalization and instead divide the weighted sum by that constant magnitude.

    And since that magnitude is a constant for each noise function, you can hard-code the 1/sqrt() normalizer as a constant and multiply the weighted sum instead of dividing.
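
    For illustration, here is a minimal GLSL sketch of that folding, for the 3D case where every gradient has the form (±1, ±1, 0) and hence length sqrt(2). The names cornerContribution, combineCorners, INV_GRAD_LENGTH and FINAL_SCALE are made up for this example, and the final range scale is left as a placeholder, so treat this as a sketch of the idea rather than actual shader code.

    Code :
    // Contribution from one corner, using the raw (unnormalized) gradient.
    float cornerContribution(vec3 grad, vec3 offset, float weight)
    {
        return weight * dot(grad, offset);   // no normalize(grad) here
    }

    // One constant multiply at the end replaces one normalize() per corner.
    const float INV_GRAD_LENGTH = 0.7071067811865475; // 1.0 / sqrt(2.0)
    const float FINAL_SCALE = 1.0; // placeholder for whatever [-1,1] range scale is used

    float combineCorners(float n0, float n1, float n2, float n3)
    {
        return FINAL_SCALE * INV_GRAD_LENGTH * (n0 + n1 + n2 + n3);
    }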

  3. #33
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    And if I'm reading your shader correctly, you calculate your gradients w/ the same algorithm. If so, then there is an optimization you can use.
    No, the gradients are computed in a totally different manner here. It's a more clever method in several ways, but the gradients end up with different lengths.

    Besides, the normalization of Perlin's original gradients does not really matter at all. What matters is that they are all of the same length. A constant scaling of the gradients translates to a constant scaling of the final noise value, and it is cheaper to do all the scaling at once at the end, when the final noise value is returned and a multiplication is required anyway to make the value fit nicely in the range [-1,1].

    Not normalizing the gradients is not an error in Perlin's original implementation; it is a deliberate and smart design choice to speed things up. The dot product with a gradient vector whose components are all zeroes and ±ones was originally performed in software as a signed selection and sum of coordinates, not as a full dot product. Several floating-point multiplications were saved that way, and that used to make a big difference back in the day.
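
    For reference, this is roughly what that selection trick looks like. The grad() function below is a transcription of the gradient step from Perlin's published reference implementation into GLSL (assuming GLSL 1.30 or later for the integer bitwise operators); it is shown here only to illustrate the point and is not part of the shader discussed in this thread.

    Code :
    // Gradient "dot product" from Perlin's improved noise: every gradient
    // component is -1, 0 or +1, so the dot product with the offset vector d
    // reduces to a signed sum of two selected coordinates. No multiplications
    // and no normalization are needed.
    float grad(int hash, vec3 d)
    {
        int h = hash & 15;
        float u = (h < 8) ? d.x : d.y;
        float v = (h < 4) ? d.y : ((h == 12 || h == 14) ? d.x : d.z);
        return (((h & 1) == 0) ? u : -u) + (((h & 2) == 0) ? v : -v);
    }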

  4. #34
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Speaking of inclusion in Mesa: how good are the AMD hardware drivers for Mesa these days?
    I have not been following Mesa development for some time, but I notice it is still stuck at OpenGL 2.1. This noise implementation is compatible with GLSL 1.20, so it could still be a good fit.
    Inclusion in Mesa requires an MIT license, though.

  5. #35
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    I just wrote a quick automatic benchmark for the platforms I have available to me. The program runs for 15 seconds and reports the performance to a logfile. Feel free to post your results here.
    Windows benchmark
    MacOS X benchmark
    Linux benchmark
    The Windows archive contains a precompiled EXE file. The other two platforms will require a "make", and possibly an installation of GLFW (www.glfw.org) if you don't have it already. You may also need to edit the Makefile to suit your particular installation.

    Note that the benchmark runs at a very high frame rate on most GPUs, so it makes a big difference if you turn off any desktop compositor you may have running. Under Windows 7, switching to fullscreen rendering gave me a 50% performance boost, which is how I got these very encouraging numbers on my low-cost GeForce GTX 260. (4D noise in particular might receive some optimization soon, but I wouldn't expect any huge speedups.)
    GL vendor: NVIDIA Corporation
    GL renderer: GeForce GTX 260/PCI/SSE2
    GL version: 3.2.0
    Framebuffer size: 1920 x 1200 pixels

    2D simplex noise, version 2011-03-22, 1552.3 Msamples/s
    3D simplex noise, version 2011-03-22, 752.4 Msamples/s
    4D simplex noise, version 2011-03-22, 429.7 Msamples/s

  6. #36
    Member Regular Contributor trinitrotoluene's Avatar
    Join Date
    Sep 2008
    Location
    Montérégie,Québec
    Posts
    362

    Re: GLSL noise fail? Not necessarily!

    Results without antialiasing on a Radeon 5870 with stock settings:
    Code :
    GL vendor:   ATI Technologies Inc.
    GL renderer: ATI Radeon HD 5800 Series
    GL version:  4.1.10524 Compatibility Profile Context
    Framebuffer size: 1920 x 1080 pixels fullscreen
    2D simplex noise, version 2011-03-21, 6914.1 Msamples/s
    3D simplex noise, version 2011-03-21, 3837.2 Msamples/s
    4D simplex noise, version 2011-03-21, 2427.0 Msamples/s
    With 24x antialiasing:
    Code :
    GL vendor:   ATI Technologies Inc.
    GL renderer: ATI Radeon HD 5800 Series
    GL version:  4.1.10524 Compatibility Profile Context
    Framebuffer size: 1920 x 1080 pixels fullscreen
    2D simplex noise, version 2011-03-21, 1838.5 Msamples/s
    3D simplex noise, version 2011-03-21, 1519.3 Msamples/s
    4D simplex noise, version 2011-03-21, 1235.4 Msamples/s

  7. #37
    Junior Member Regular Contributor
    Join Date
    Jan 2005
    Posts
    182

    Re: GLSL noise fail? Not necessarily!

    Quote Originally Posted by StefanG
    Speaking of inclusion in Mesa: how good are the AMD hardware drivers for Mesa these days?
    As usual they're great, and sometimes even better than the official ones for old hardware, but still struggling on newer hardware.
    I have not been following the Mesa development for some time, but I notice it is still stuck at OpenGL 2.1.
    New GL features are added slowly over time, but there are still a few GL 3 ones missing. See http://cgit.freedesktop.org/mesa/mes...n/docs/GL3.txt for details.

    Philipp

  8. #38
    Super Moderator OpenGL Lord
    Join Date
    Dec 2003
    Location
    Grenoble - France
    Posts
    5,580

    Re: GLSL noise fail? Not necessarily!

    Vista SP2:
    Code :
    GL vendor:   NVIDIA Corporation
    GL renderer: GeForce GTX 275/PCI/SSE2
    GL version:  3.3.0
     
    2D simplex noise, version 2011-03-21, 2037.3 Msamples/s
    3D simplex noise, version 2011-03-21, 962.2 Msamples/s
    4D simplex noise, version 2011-03-21, 653.6 Msamples/s

    Default window size; is there a way to convince the .exe to run fullscreen?

  9. #39
    Member Regular Contributor trinitrotoluene's Avatar
    Join Date
    Sep 2008
    Location
    Montérégie,Québec
    Posts
    362

    Re: GLSL noise fail? Not necessarily!

    Default window size; is there a way to convince the .exe to run fullscreen?
    To run fullscreen, I modified one line in the source code, since I did not see any command-line option accepted by the program.

    Code :
    glfwOpenWindow(1920, 1080, 8,8,8,8, 32,0, GLFW_FULLSCREEN);

    My results were obtained on Ubuntu 10.10.

  10. #40
    Junior Member Regular Contributor
    Join Date
    Aug 2001
    Location
    Norrkoping, Sweden
    Posts
    112

    Re: GLSL noise fail? Not necessarily!

    Default window size; is there a way to convince the .exe to run fullscreen?
    Apart from editing the source and recompiling, not right now.
    I really should change that, but still, it's nice to see some benchmarks from high-end cards. The result from the MacBook Pro I am running right now is not quite as impressive:

    GL vendor: NVIDIA Corporation
    GL renderer: NVIDIA GeForce 9400M OpenGL Engine
    GL version: 2.1 NVIDIA-1.6.18

    2D simplex noise, version 2011-03-21, 197.3 Msamples/s
    3D simplex noise, version 2011-03-21, 82.1 Msamples/s
    4D simplex noise, version 2011-03-21, 38.8 Msamples/s
