I've always had a bit of a problem with OpenGL's attenuation factor: it drops to very low lighting values very quickly, but doesn't get close enough to 0 to be safely eliminated until a very long distance. This makes it very inconvenient to use in any practical environment, because you have to apply the light to a lot of objects on which it has only a minimal effect.
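To make the complaint concrete, this is the curve the fixed-function pipeline evaluates (the function and coefficient names here are mine, not OpenGL's):

```c
/* OpenGL's fixed-function attenuation factor:
   1 / (kc + kl*d + kq*d*d), where d is the distance from the
   light and kc, kl, kq are the constant, linear, and quadratic
   coefficients. With a nonzero linear or quadratic term it falls
   off steeply near the light but only approaches zero
   asymptotically -- it never actually reaches it. */
static float gl_attenuation(float d, float kc, float kl, float kq)
{
    return 1.0f / (kc + kl * d + kq * d * d);
}
```

With, say, kc = 1 and kq = 0.25, the factor is already down to about 1% at d = 20, yet it is still nonzero at any distance you care to name, so a distant object can never be culled from the light's influence on mathematical grounds alone.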
A much more convenient way of handling falloff is a simple linear one: 100% brightness at the light's position, falling to 0% at its maximum range. It may be a bit less realistic, but it still looks fine in use, and it allows unaffected objects to be conveniently eliminated from the lighting calculations. Plus, you can make it more realistic, if you prefer, by squaring the resulting brightness. This produces a curve that looks remarkably similar to OpenGL's own attenuation curve, except that this one actually falls to zero.
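A minimal sketch of what I mean (the function and parameter names are just mine):

```c
/* Linear falloff: 1.0 at the light's position, 0.0 at its
   maximum range, and clamped to 0.0 beyond that, so objects
   past the range can be skipped entirely. */
static float linear_falloff(float d, float range)
{
    float b = 1.0f - d / range;
    return b < 0.0f ? 0.0f : b;
}

/* Squaring the result gives a curve that hugs OpenGL's
   attenuation curve much more closely, but still reaches
   exactly zero at the maximum range. */
static float squared_falloff(float d, float range)
{
    float b = linear_falloff(d, range);
    return b * b;
}
```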
For this reason, I've usually used my own lighting functions. However, I've recently been working on an engine where it would be much faster and more convenient to use OpenGL's built-in lighting, and its attenuation model has become a real problem. I need a way to force linear falloff on these OpenGL lights.
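For reference, as far as I can tell the only knobs the fixed-function pipeline exposes are the three coefficients of that denominator (the coefficient values here are just placeholders):

```c
/* These only shape 1/(kc + kl*d + kq*d*d); there is no mode
   that clamps the light to zero at a maximum range. */
glLightf(GL_LIGHT0, GL_CONSTANT_ATTENUATION,  1.0f);
glLightf(GL_LIGHT0, GL_LINEAR_ATTENUATION,    0.05f);
glLightf(GL_LIGHT0, GL_QUADRATIC_ATTENUATION, 0.01f);
```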
The one idea I had was to use a 3D texture containing a grayscale sphere, 100% brightness in the middle, falling off to 0 at the edges, then translate this texture to match the light's position and modulate it in. Unfortunately, I'm having trouble making 3D textures work properly (which is a separate problem destined for a separate thread), and I have a feeling there is a better way to do this (besides pixel shaders, which I don't want to resort to unless I absolutely have to). Does anybody have any ideas?