View Full Version : Perlin Noise involving LOD textures

11-29-2007, 05:29 PM
Ok I am new to using GLSL to create shaders, and I have a current project I was wondering if I could get some advice and help on. The project involves LOD (Level of Detail) textures, meaning (if you don't already know) that more than one texture is visible at a time depending on the eye point. All that aside: how do I create noise on a texture (image) at runtime? (And yes, I am passing the image to the shader as a simpleRect.) Also, how do I create noise at a pixel based on two images that meet at an intersection? That is, where LOD1Image meets LOD2Image, how do I mix the colors and create noise on that pixel?

11-30-2007, 03:41 AM
Ok I am telling you your prose is hard to read, meaning if you don't already know what that means you should look it up in a dictionary.


1) Create a noise texture: search for Perlin noise. In the Orange Book (http://www.3dshaders.com/home/) there is quite a lot of discussion about this.

2) For each pixel: color = (LOD1Image texel + LOD2Image texel) / 2.0
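A CPU-side sketch of both steps in Python, for illustration: a simple value-noise texture (smoothly interpolated random lattice values, a common stand-in for true gradient/Perlin noise) that you could upload as a GL texture, plus the 50/50 texel blend from step 2. The function names are my own, not from any API.

```python
import random

def make_noise_texture(size=64, grid=8, seed=1):
    """Generate a size x size value-noise texture (values in [0, 1]).

    Random values on a coarse lattice are smoothly interpolated between
    grid points -- a simple stand-in for true gradient (Perlin) noise.
    Indices wrap, so the resulting texture tiles.
    """
    rng = random.Random(seed)
    lattice = [[rng.random() for _ in range(grid)] for _ in range(grid)]

    def smoothstep(t):
        # Hermite smoothing, as in Perlin's original paper
        return t * t * (3.0 - 2.0 * t)

    tex = []
    for y in range(size):
        row = []
        for x in range(size):
            # Position in lattice space
            gx, gy = x * grid / size, y * grid / size
            x0, y0 = int(gx), int(gy)
            x1, y1 = (x0 + 1) % grid, (y0 + 1) % grid
            tx, ty = smoothstep(gx - x0), smoothstep(gy - y0)
            # Bilinear interpolation of the four surrounding lattice values
            top = lattice[y0][x0] * (1 - tx) + lattice[y0][x1] * tx
            bot = lattice[y1][x0] * (1 - tx) + lattice[y1][x1] * tx
            row.append(top * (1 - ty) + bot * ty)
        tex.append(row)
    return tex

def blend_texels(a, b):
    """Step 2: average the two LOD texels component-wise."""
    return tuple((ca + cb) / 2.0 for ca, cb in zip(a, b))
```

In the fragment shader the second function is just `mix(texel1, texel2, 0.5)`.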

Nicolai de Haan
11-30-2007, 09:32 AM
Hehe Zbuffer.

Instead of pre-computing a noise texture and looking it up (i.e. sampling it) in the pixel shader, it's possible to cut down on the number of texture samples by calculating the "blend noise" yourself. Maybe you should also look at the noise function in GLSL?
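Computing the blend noise per fragment usually means a small hash-based noise function instead of a texture fetch. Here is a CPU-side Python sketch of the idea; the constants are the well-known `fract(sin(dot(p, k)) * 43758.5453)` trick commonly seen in shader code, and the function names are my own.

```python
import math

def hash_noise(x, y):
    """Pseudo-random value in [0, 1) from 2D coordinates.

    Mirrors the classic fract(sin(...)) hash often used in fragment
    shaders when a built-in noise() is unavailable.
    """
    v = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return v - math.floor(v)  # fract()

def blend_with_noise(texel1, texel2, x, y, strength=0.1):
    """Average the two LOD texels, then perturb by per-pixel noise."""
    n = (hash_noise(x, y) - 0.5) * strength  # centered noise
    return tuple(min(1.0, max(0.0, (a + b) / 2.0 + n))
                 for a, b in zip(texel1, texel2))
```

The same two functions translate almost line for line into GLSL, with `fract()` replacing the manual floor subtraction and `clamp()` replacing the min/max.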

Chapter 8.9

11-30-2007, 10:20 AM
The GLSL noise() function is not implemented by NVIDIA (it returns white) and is software-emulated by ATI (uber slow).

Do a search in the forum; there have been discussions about noise implementations in recent weeks.

12-02-2007, 07:01 AM
Not white, but zero. ;-)

12-02-2007, 07:15 AM
White for NVIDIA indeed :)
And about ATI, it is supported from the X1000 series and up.

12-02-2007, 03:16 PM
My comment there refers to the dFdx() and dFdy() functions, not the noise functions (which are not supported in hardware).

12-02-2007, 03:28 PM
Ah, my bad. No hardware noise implementation then :(