Simplex noise in GLSL

Because my original Perlin noise thread went a bit off-topic after 20 replies, here’s a new thread announcing the very first (to my knowledge) implementation of Perlin simplex noise in a fragment shader.

Simplex noise is better looking AND faster, and a true derivative can be computed without much extra work. I kept the classic noise in there too for comparison. The code is also cleaned up and commented some more.

ZIP file with Win32 executable and full source

As before, I’m developing on really low end hardware, so please post your frame rates if you have anything better than my GeForce 5600XT.

I scrapped the teapot; that code was CPU-limited and didn’t do the shader justice. Just start the program, look at the rotating noise-textured sphere, and tell me what the FPS counter in the title bar says.

On a side note, that reference implementation Java code from Ken Perlin was REALLY hard to understand! I ended up reading his paper instead and doing everything from scratch.
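
For anyone who just wants to see the overall shape of the algorithm without downloading the ZIP, 2D simplex noise boils down to roughly the sketch below. This is not the code in the demo (which uses lookup textures for the permutation and gradient tables); it is a textureless variant that replaces the tables with an arithmetic hash, written from memory, so treat the names and constants as approximate.

vec2 mod289(vec2 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
vec3 mod289(vec3 x) { return x - floor(x * (1.0 / 289.0)) * 289.0; }
vec3 permute(vec3 x) { return mod289(((x * 34.0) + 1.0) * x); } // cheap hash instead of a permutation texture

float snoise(vec2 v)
{
    const vec4 C = vec4( 0.211324865405187,   // (3.0 - sqrt(3.0)) / 6.0
                         0.366025403784439,   // 0.5 * (sqrt(3.0) - 1.0)
                        -0.577350269189626,   // -1.0 + 2.0 * C.x
                         0.024390243902439);  // 1.0 / 41.0

    // Skew the input point and find the base corner of its simplex cell
    vec2 i  = floor(v + dot(v, C.yy));
    vec2 x0 = v - i + dot(i, C.xx);

    // Pick the second corner depending on which triangle we are in
    vec2 i1 = (x0.x > x0.y) ? vec2(1.0, 0.0) : vec2(0.0, 1.0);
    vec4 x12 = x0.xyxy + C.xxzz;
    x12.xy -= i1;

    // Hash the three corners
    i = mod289(i);
    vec3 p = permute(permute(i.y + vec3(0.0, i1.y, 1.0)) + i.x + vec3(0.0, i1.x, 1.0));

    // Radial falloff from each corner
    vec3 m = max(0.5 - vec3(dot(x0, x0), dot(x12.xy, x12.xy), dot(x12.zw, x12.zw)), 0.0);
    m = m * m;
    m = m * m;

    // Gradients: 41 points on a line, mapped onto a diamond
    vec3 x = 2.0 * fract(p * C.www) - 1.0;
    vec3 h = abs(x) - 0.5;
    vec3 ox = floor(x + 0.5);
    vec3 a0 = x - ox;

    // Normalise the gradients implicitly by scaling the falloff
    m *= 1.79284291400159 - 0.85373472095314 * (a0 * a0 + h * h);

    // Sum the three corner contributions and scale to roughly [-1, 1]
    vec3 g;
    g.x  = a0.x * x0.x + h.x * x0.y;
    g.yz = a0.yz * x12.xz + h.yz * x12.yw;
    return 130.0 * dot(m, g);
}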

about 880 FPS on my 6800 (non-GT/Ultra) with the default screen size.

Looks like a lava lamp… can’t look away… oooooh

startup crashes and burns on my 9800xt :frowning:


Fragment shader compile error

ERROR: 0:242: ‘:’ : wrong operand types no operation ‘:’ exists that takes a left-hand operand of type ‘const float’ and a right operand of type ‘const int’ (or there is no acceptable conversion)
ERROR: 0:242: ‘=’ : cannot convert from ‘const int’ to ‘float’
ERROR: 0:243: ‘:’ : wrong operand types no operation ‘:’ exists that takes a left-hand operand of type ‘const float’ and a right operand of type ‘const int’ (or there is no acceptable conversion)
ERROR: 0:243: ‘=’ : cannot convert from ‘const int’ to ‘float’
ERROR: 4 compilation errors. No code generated.

9800xt, Cat4.12beta drivers (most recent), WinXP

I’m not currently in front of my GeForce 6800 GT but will run the tests later today. I’d like to comment on Nvidia implementing their GLSL compiler alongside their Cg compiler: if you develop on Nvidia hardware and then try it on an ATI card, you always find things you overlooked. Looking at the language spec, Nvidia does not have a tight implementation; ATI agrees more closely with the actual spec.

Maybe a bit pedantic, but shouldn’t we lobby Nvidia for an implementation that sticks tighter to the spec, so that cross-vendor development is easier for us developers?

The advantage to the Nvidia developer is that some Cg parts and names are included, but I think I would prefer a tighter implementation.

Just a comment, really.

Hi, I just ran it on my GeForce 6800 GT and got around 880 FPS in the default window.

Originally posted by paintor:
Id like to comment on nvidia implementing there GLSL compiler alongside their CG compiler. if you develop on nvidia hardware and then try it on an ati you always get things you overlooked.
We’re using NVemulate to force strict compiler warnings, so we’re able to catch any spec violations. Really helpful and does the job (no errors on ATIs).

Sorry about that silly ATI crash. It’s so very easy to forget that promotion from “0” to “0.0” does not happen automatically in GLSL when Nvidia’s compiler just silently accepts it. I for one would appreciate it if Nvidia tightened up their GLSL conformance, or at least made their non-standard extensions to GLSL something you had to ask for, not something you have to go to great lengths to avoid having.
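
For anyone wondering what kind of line actually trips this up, it is the same class of error as in the log above. A made-up minimal example (not the actual offending lines from the shader):

uniform float x;

void main()
{
    // Nvidia's compiler silently promotes the int literals to float here,
    // but a strict GLSL compiler (ATI, 3DLabs) rejects the mixed types:
    // float f = (x > 0.5) ? 0 : 1;

    // Spec-conformant version with explicit float literals:
    float f = (x > 0.5) ? 0.0 : 1.0;

    gl_FragColor = vec4(f, f, f, 1.0);
}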

I have corrected the bug. Now it should also work on ATI and 3DLabs cards.

Yep, it now works on my 9800XT at a fantastic 0.1 FPS :smiley:

Once again, I made a small update to the code at the original URL. Now the 2D noise (both classic and simplex) is significantly faster.

As it turned out, no dependent texture lookups were required for 2D noise, only a direct gradient lookup in a 2D texture. 3D noise has not changed.
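
Conceptually, the 2D gradient fetch is now a single plain lookup rather than a chain of permutation lookups. A rough sketch of the idea (the texture name, size and encoding here are made up for illustration, not the exact code in the demo):

// Hypothetical 256x256 texture with one texel per lattice point, where the
// RG channels store a permuted 2D gradient for that point.
uniform sampler2D gradTexture;

vec2 gradientAt(vec2 latticeCorner)
{
    const float ONE = 1.0 / 256.0;   // one texel
    const float HALF = 0.5 / 256.0;  // centre of a texel
    // The coordinates depend only on the lattice corner, never on the result
    // of a previous texture read, so this is not a dependent lookup.
    vec2 uv = latticeCorner * ONE + HALF;
    return texture2D(gradTexture, uv).rg * 2.0 - 1.0; // expand [0,1] storage to [-1,1]
}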

4D simplex noise is coming up next. I hope.

Stefan

“bobvodka”, could you please try the 2D noise, both simplex and classic? Code to do this is in main() in the fragment shader, but commented out.

The lines starting with “n=” are in order: 2D classic, 2D simplex, 3D classic, 3D simplex. Comment out the 3D simplex version and try the 2D variants instead. You should be able to run at least the 2D noise on the ATI 9800, because it now has no dependent texture lookups at all. (Note that the code changed a few minutes ago.) If this does not run in hardware either, please tell me.

(When you pick another noise function, you will get warnings that some uniform variables are no longer found, but that’s OK, the 2D noise is not animated and does not use the “time” variable, and classic noise does not use the “simplexTexture” sampler1D.)
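
The block in main() looks roughly like the snippet below (the identifiers here are simplified for illustration and may not match the actual file exactly); uncomment exactly one of the assignments:

    // n = noise(v_texCoord2D);               // 2D classic
    n = snoise(v_texCoord2D);                 // 2D simplex
    // n = noise(v_texCoord3D + vec3(time));  // 3D classic, animated
    // n = snoise(v_texCoord3D + vec3(time)); // 3D simplex, animated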

I for one would appreciate it if Nvidia tightened up their GLSL conformance, or at least made their non-standard extensions to GLSL something you had to ask for, not something you have to go to great lengths to avoid having.
Stefan, I suggest you check your shaders against the generic 3DLabs compiler (if you use Shader Designer, it is built in, under the generic compiler tab).

Hi there

Runs at an impressive 0.06 FPS (though the window title claims 0.1) on my Radeon 9600XT.

And well, it looks like… I don’t know… it’s a sphere… with some indescribable texture (??).

Certainly it is much more interesting if you get the animation without illegal use of drugs.

Hope you get it fixed for us ATI guys soon.

Jan.

PS: I am using the latest official ATI drivers.
