
View Full Version : Perlin noise textures?



Humus
05-10-2001, 02:34 PM
I've recently worked a little with Perlin noise and found it to be quite useful, for instance for animated bump maps for EMBM to create realistic water. There are way more uses of Perlin noise than that, and I'm pretty sure plenty of you guys are familiar with it and have used it.

However, for Perlin noise to be really useful I need to be able to dynamically create and upload new textures each frame, which takes quite a lot of CPU power ... :(

I thought of a solution to this, namely to put Perlin noise textures into hardware. Exactly how to expose it in OpenGL I don't know, but I guess it could be used the same way as other textures, you only pass it texture coords etc. ... maybe it would need its own targets, like GL_PERLIN_TEXTURE_1D, GL_PERLIN_TEXTURE_2D and GL_PERLIN_TEXTURE_3D or something ... but that's not an important issue right now.
Besides all those possible uses, Perlin textures would also have several advantages over normal textures: they need no memory storage or uploads, since they're evaluated directly from a formula; no texture memory bandwidth is needed during rasterisation, which could speed things up a bit; and they have infinite resolution.

I figure there may be some drawbacks too (as always ;)):
It may be a little expensive to implement in hardware? Not sure about that, but at least the 3D Perlin function seems quite complex. I'm also not sure about texture filtering. You of course don't need to take four samples and interpolate, one sample is enough since Perlin noise is continuous, but I guess you would need some sort of mipmapping. I'm not sure how that would be implemented, but it's probably solvable.

Now what do you guys think? Is this a good idea?

V--man
05-10-2001, 07:46 PM
No, not a very good idea I'd say. OpenGL should not become an algorithm machine. The solution is to have better hardware that can upload those textures faster, so everyone can enjoy it.

V-man

j
05-10-2001, 08:12 PM
Interesting idea, but I don't know if that would be the best way to do it. The whole idea behind Perlin noise is its customizability. With some simple math like absolutes, sines, and other functions, it can be manipulated to look like almost anything. The problem with having it be a texture type is that most of that customizability would be lost.

Perhaps a type of pixel shader would do the trick. I'm not talking about the current pixel shaders, but something more like vertex programs, with an assembly-language-like interface. I guess that would make them pixel programs :). nVidia implemented Perlin noise in a vertex program example, so if something like vertex programs were available on a per-pixel level, Perlin noise might be possible. I don't think that current pixel shaders are up to the task yet, and implementing pixel shaders with even more functionality than the current ones would be pretty tough in terms of hardware speed/cost.

As for implementing Perlin noise in hardware, there's an interesting page by Ken Perlin (who first thought up Perlin noise) about doing exactly that. It's at http://www.noisemachine.com/talk1/13.html

j

Humus
05-11-2001, 02:52 PM
Originally posted by V--man:
No, not a very good idea I'd say. OpenGL should not become an algorithm machine. The solution is to have better hardware that can upload those textures faster, so everyone can enjoy it.

V-man

Well, I'm not sure about that. The shaders are already sort of turning OpenGL into an "algorithm machine" in a way. And better hardware to upload those textures isn't exactly coming fast; AGP 4x has been here for a while, yet AGP 8x isn't expected in any products before 2003.

Humus
05-11-2001, 03:05 PM
Originally posted by j:
Interesting idea, but I don't know if that would be the best way to do it. The whole idea behind Perlin noise is its customizability. With some simple math like absolutes, sines, and other functions, it can be manipulated to look like almost anything. The problem with having it be a texture type is that most of that customizability would be lost.

Perhaps a type of pixel shader would do the trick. I'm not talking about the current pixel shaders, but something more like vertex programs, with an assembly-language-like interface. I guess that would make them pixel programs :). nVidia implemented Perlin noise in a vertex program example, so if something like vertex programs were available on a per-pixel level, Perlin noise might be possible. I don't think that current pixel shaders are up to the task yet, and implementing pixel shaders with even more functionality than the current ones would be pretty tough in terms of hardware speed/cost.

As for implementing Perlin noise in hardware, there's an interesting page by Ken Perlin (who first thought up Perlin noise) about doing exactly that. It's at http://www.noisemachine.com/talk1/13.html

j

There may be better ways to do it, but I figured that with pixel shaders you'd still be able to have (close to) the customizability you get by creating the textures with the CPU. Honestly though, I have no experience working with pixel shaders, so I don't know how flexible they are ...

Of course textures aren't the only use of Perlin noise, but they're an important one ...
It might actually be very useful to have Perlin noise in vertex shaders too. You said that "nVidia implemented Perlin noise in a vertex program example", so I guess that means you can already do it with vertex shaders. I don't know how long a vertex program you'd need for that; perhaps it could pay off to have a sort of perlin instruction? Like you feed it a vertex and it outputs the Perlin noise function for that point ... just a thought ...