
Thread: Linear Falloff on Built-in OpenGL Lighting

  1. #1
    Junior Member Newbie
    Join Date: May 2004
    Location: Santa Barbara, CA
    Posts: 1

    Linear Falloff on Built-in OpenGL Lighting

    I've always had a bit of a problem with OpenGL's attenuation factor, in that it drops to a very low lighting value very quickly, but doesn't drop close enough to 0 to be eliminated for a very long time. This makes it very inconvenient to use in any practical environment, because you have to apply the light to a lot of objects on which it has only a minimal effect.

    A much more convenient way of handling falloff is to just have a linear falloff, with 100% brightness at the light's position, and 0% at its maximum range. It may be a bit less realistic, but it still looks fine in use, and allows a very convenient elimination of unaffected objects from the lighting calculations. Plus, you can make it more realistic, if you prefer, by squaring the resulting brightness. This produces a curve that looks remarkably similar to OpenGL's decay attenuation curve, except that this one actually falls to zero.
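As a minimal sketch of the falloff described above (the function names and parameters here are illustrative, not from any actual engine):

```c
#include <math.h>

/* Linear falloff: full brightness at the light's position, zero at
 * max_range, clamped so objects beyond the range get exactly zero. */
float linear_falloff(float dist, float max_range)
{
    float b = 1.0f - dist / max_range;
    return b > 0.0f ? b : 0.0f;
}

/* Squaring the linear result gives a curve closer to GL's decay
 * shape while still reaching exactly zero at max_range. */
float squared_falloff(float dist, float max_range)
{
    float b = linear_falloff(dist, max_range);
    return b * b;
}
```

Because the result is exactly zero at max_range, a simple distance test is enough to cull a light from an object's lighting calculations.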

    For this reason, I've usually used my own lighting functions. However, I've recently been working on an engine where it would be much faster and more convenient to use OpenGL's built-in lighting, but its attenuation method has become a real problem. I need a way to force linear falloff on these OpenGL lights.

    The one idea I had was to try using a 3D texture containing a grayscale sphere, 100% brightness in the middle, falling off to 0 at the edges, and then translate this texture to match the light's position, and modulate it in. Unfortunately, I'm having trouble making 3D textures work properly (which is a separate problem destined for a separate thread), and I have a feeling there is a better way to do this (besides pixel shaders, which I don't want to resort to if I don't absolutely have to). Does anybody have any ideas?
    - Heru Feanor (Mike Powell)

  2. #2
    Advanced Member Frequent Contributor plasmonster
    Join Date: Mar 2004
    Posts: 739

    Re: Linear Falloff on Built-in OpenGL Lighting

    The attenuation model in the GL is reciprocal quadratic:

    atten = 1 / (a + bx + cx^2)

    You could make this nearly linear by solving a system of 3 equations for a, b, and c. But for simplicity, let's get rid of the quadratic term by setting c = 0. So we now have only a and b.

    atten = 1 / (a + bx)

    Say at x = 0 you want atten = 1, so a = 1. At x = 1000, let atten = 0.005, so we have

    1 / (1 + b*1000) = 0.005

    b = (1 / 0.005 - 1) / 1000 = 0.199.

    You can take this to its logical conclusion. But for true linear attenuation, you might consider rolling your own with vertex and/or fragment programs.
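As a sketch of the derivation above (the range of 1000 and the 0.005 cutoff are just the example values from this post), the constants can be computed and fed straight into the fixed-function light:

```c
/* Solve 1 / (1 + b*range) = atten_at_range for b,
 * assuming the constant term a = 1 so atten = 1 at x = 0. */
float linear_atten_coeff(float range, float atten_at_range)
{
    return (1.0f / atten_at_range - 1.0f) / range;
}

/* The GL attenuation model with the quadratic term zeroed out. */
float attenuation(float a, float b, float x)
{
    return 1.0f / (a + b * x);
}

/* Usage with the built-in lighting (inside a GL context):
 *   glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  1.0f);
 *   glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,
 *            linear_atten_coeff(1000.0f, 0.005f));
 *   glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 0.0f);
 */
```

With these values, atten is 1 at the light and roughly 0.005 at x = 1000, but note this curve is still reciprocal, not linear; it never reaches exactly zero.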
