
Junior Member
Regular Contributor
Re: GLSL noise fail? Not necessarily!
I was speaking of ATI drivers in particular, where constant expressions don't seem to be identified and collapsed properly. The other thing I noticed, that replacing a division by a constant with a multiplication by the inverse of the constant makes a difference, is something that would perhaps be considered an aggressive optimization (because it changes the exact value of the result somewhat), and I may have been expecting too much there. GLSL is compiled on the fly, after all.
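For the record, the two expressions really do round differently, so a strictly correct compiler cannot make the substitution on its own. A quick check in Python, emulating single precision with the standard struct module, shows the mismatch:

```python
import struct

def f32(x):
    """Round a Python double to the nearest IEEE 754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Division by the constant, evaluated in single precision:
div = f32(f32(5.0) / f32(3.0))

# The same expression rewritten as a multiplication by the
# precomputed reciprocal. 1/3 is not exactly representable,
# so the rounding comes out differently:
inv3 = f32(1.0 / 3.0)
mul = f32(f32(5.0) * inv3)

print(div == mul)  # False: the results differ in the last bit
```

That last-bit difference is why the rewrite counts as an "unsafe" optimization, and why doing it by hand in the shader source can matter.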
I have absolutely no experience with Intel GPUs.

Re: GLSL noise fail? Not necessarily!
The wiki on the Github repository now links to a rewritten cross-platform benchmark with a side-by-side comparison of my old GLSL noise implementation (which was texture bandwidth limited and used lots of texture lookups) with the new computational version.
Github repository wiki
Bottom line: my old version is still twice as fast, because there is a lot of texture bandwidth on a modern GPU, but the new version scales better with massive parallelism, and it mixes well into a shader that is already texture bandwidth limited. It may even come almost for free when combined with a texture intensive shader with untapped ALU resources.

Cellular noise in GLSL
Just a quick update: yesterday I wrote some cellular noise functions ("Worley noise") of various flavors for GLSL, using the same pseudorandom permutation method as the Perlin noise implementations that started this thread. It turned out well, and these functions share the advantages of the Perlin noise functions: no arrays or textures, GLSL 1.20 compatible, and fast enough to be considered for actual use. The code is still a bit raw and needs some more attention to detail, and the brief write-up probably needs a spellcheck, but I'll do that in the next few days.
We'll see if this ends up on the same Github repository as the Perlin noise, or if it's going to be kept separate. In any case, here's an early release of the GLSL shader functions without any supporting CPU code:
Cellular noise in GLSL
If you need a framework to test it, you can edit the C program I wrote for benchmarking Perlin noise, available from the Github repository (http://github.com/ashima/webglnoise).

Re: Cellular noise in GLSL
Wow. 12,000 views and still ticking for this post. It will be fun to see what people make with this!
I sent in a suggestion for a talk at Siggraph, but the reviewers rejected it, which I more or less expected. However, I put quite some effort into creating a fun visual demo using a few of my own and Ian's noise functions, and you might find it useful, educational or just fun to watch:
http://www.itn.liu.se/~stegu/gpunoise/
Note that this particular talk was rejected and will not be featured at Siggraph, so please ignore the references to an oral presentation in the one-page PDF.
You can still find me at Siggraph at our accepted talk "Next generation Image Based Lighting by HDR Video" if you want to meet me there. That talk is about what I really do for a living; I do noise mainly for fun.

Super Moderator
OpenGL Lord
Re: Cellular noise in GLSL
Great, I really like both the "fire" shader and the one on the floor.

Re: Cellular noise in GLSL
The one on the floor ("flow noise") actually uses a previously unpublished version of 2D noise with rotating gradients and analytic derivatives. I ran into problems extending it nicely to 3D (I need to get rid of a couple of lookup tables in my software version), but I should at least do simplex noise with derivatives for 2D, 3D and 4D. Those should be straightforward ports from my software versions.
A few different variations on cellular noise and the simplex noise with derivatives should end up in the Github repo eventually, but my daytime job has put this on the back burner for a while. I hope to get my act together soon on this.

Re: Cellular noise in GLSL
I was very interested to see these new developments of simplex noise shaders, particularly because of the removal of the texture lookups. I have recently been looking at procedural noise functions for an application in physics (not graphics) and am especially interested in their spatial frequency power spectrum, which very quickly throws up artefacts! Some of these artefacts appear in both old (texture lookup based) and new versions of the shader; some I have found fixes for and others not. Based only on the 3D versions of noise generation, briefly these are:
1. Discontinuities at simplex boundaries, seemingly because the contribution from the opposite vertices has not decayed completely to zero. Fixed by replacing the constant "0.6..." with "0.51..." in the code in both versions. (I'm sure someone who knows the simplex geometry can find the exact constant required).
2. Floating point rounding errors when far from the origin (I think someone alluded to this in an earlier post) that can also cause artefacts at simplex boundaries. I fixed these to a degree in the original code by reordering some of the cell skewing calculations, but I can't yet see where to do the same in the new code, though the issue seems to be there as well.
3. Randomness: The original code produces some residual structure in its power spectrum when averaged over many noise screens. I could remove that (make it look smooth when averaged over several noise screens) by redefining the w (or alpha) values in the lookup texture using an independent random number generator from MATLAB.
4. Pattern repeats seem very regular in certain directions in the new code. I don't understand how the permutation polynomial works, nor the significance of the constants (289, 34, 1, 7). Can the pitch of the repeat be increased by changing the constants? The repeats are even visible by eye on relatively fine noise screens!
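Point 1 can be reproduced with plain arithmetic outside the shader. The radial falloff in the code has the form max(C - r^2, 0)^4, and if the simplex boundary sits near r^2 = 0.5, which the suggested replacement constant implies (the exact boundary value is my assumption), then C = 0.6 leaves a step there:

```python
def falloff(C, r2):
    """Radial kernel as used in the simplex noise code: max(C - r2, 0)^4."""
    t = C - r2
    return t ** 4 if t > 0.0 else 0.0

# At the squared distance where a vertex should stop contributing:
step_06 = falloff(0.6, 0.5)  # original constant: still nonzero
step_05 = falloff(0.5, 0.5)  # reduced constant: exactly zero

print(step_06 > 0.0)  # True -> a discontinuity at the simplex boundary
print(step_05)        # 0.0 -> the contribution fades out completely
```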
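Point 2 follows from the spacing of single-precision floats: near x = 10^6 adjacent float32 values are 1/16 apart, so sample positions that are distinct in the source collapse to the same coordinate before the noise code ever runs. This can be illustrated by rounding doubles to singles with the struct module:

```python
import struct

def f32(x):
    """Round a double to the nearest IEEE 754 single, as the GPU stores it."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Near the origin, nearby sample positions stay distinct:
print(f32(1.10) == f32(1.12))              # False

# A million units out, float32 spacing is 1/16, and the same two
# offsets quantise to the identical coordinate:
print(f32(1000000.10) == f32(1000000.12))  # True
```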
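As for point 4, the repeats themselves are easy to confirm outside the shader, if I read the published source correctly and the hash is the polynomial ((34x + 1)x) mod 289. The modulus 289 = 17^2 together with the multiplier 34 = 2*17 makes it a genuine permutation of 0..288, but it also fixes the repeat period at 289 cells, and changing the constants casually tends to destroy the permutation property rather than lengthen the period:

```python
def permute(x):
    """Permutation polynomial from the shader source: ((34*x + 1)*x) mod 289."""
    return ((34 * x + 1) * x) % 289

# It really is a bijection on 0..288: every output occurs exactly once.
print(len({permute(x) for x in range(289)}))  # 289

# But the period is only 289, so the hash (and with it the noise
# pattern) repeats every 289 cells along each axis:
print(permute(5) == permute(5 + 289))         # True
```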
I hope this is useful in some way and am looking forward to the full write-up of these techniques...

Re: Cellular noise in GLSL
Would anyone be able to extend this to provide a version that outputs a gradient vector as well as the noise value? This would be very useful for generating surface normals.
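In the meantime, a central-difference approximation of the gradient is enough for normals, at the cost of extra noise evaluations per sample. A sketch, with a placeholder function h standing in for whichever scalar noise call is used (the name h is mine):

```python
import math

def surface_normal(h, x, y, eps=1e-4):
    """Normal of the height field z = h(x, y), via central differences."""
    dhdx = (h(x + eps, y) - h(x - eps, y)) / (2.0 * eps)
    dhdy = (h(x, y + eps) - h(x, y - eps)) / (2.0 * eps)
    # The (unnormalised) normal of a height field is (-dh/dx, -dh/dy, 1):
    nx, ny, nz = -dhdx, -dhdy, 1.0
    inv_len = 1.0 / math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx * inv_len, ny * inv_len, nz * inv_len)

# Sanity check on the plane z = 0.5*x, whose true normal is
# proportional to (-0.5, 0, 1):
n = surface_normal(lambda x, y: 0.5 * x, 3.0, 7.0)
print(n)
```

The four extra evaluations per sample are exactly why a version with analytic derivatives would be worth having.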

Re: Cellular noise in GLSL
I have not been keeping track of this thread for a while, so I am sorry for being so very late to respond. Your points are all very valid, and I think you should not use these new functions if you want isotropic and statistically well-behaved results. The permutation polynomial was chosen for its simplicity, not for its good permutation properties. I do not have the math skills to evaluate the quality of that permutation from a theoretical standpoint, but I suspect it has many flaws if you look at it closely enough, and that there are many candidates for better choices. The motivation for everything in the current code is that it works, it looks OK and it is fast. Using it for anything other than pattern generation for visuals is out of bounds for the design spec, so to speak.
On the subject of computing the gradient in addition to the noise value, I have code in C to do just that. I have not yet ported it to GLSL except for the 2D case, but it is a reasonably simple matter to do it for 3D and 4D as well. Right now, I have no time to do that, but 2D GLSL code to get you started is in the demo I linked to above:
http://www.itn.liu.se/~stegu/gpunoise/
(Look at the function "srdnoise" in the "flownoise2" shader)
And the 2D, 3D and 4D versions in C are here:
http://www.itn.liu.se/~stegu/aqsis/DSOs/DSOnoises.html
(Look at the functions in the file "sdnoise1234.c")