View Full Version : Point Sprite Attenuation

08-30-2008, 09:42 PM
So, I've been playing around with point sprites, but I can't seem to get attenuation working right. No matter how I set the attenuation coefficients, moving the camera around doesn't change the size of the point sprite.

The formula I found for the result size was:

size = PointSize * sqrt( 1 / (constant + linear * d + quadratic * d * d))

I figured if I just took the position of the camera and found the distance to the point sprite then I could calculate the size of the point sprite, but it doesn't seem to be working.

Here's an example of how I'm doing it:

float constant  = 0.0f;
float linear    = 0.5f;
float quadratic = 0.0f;
float thresh    = 5.0f;
float min       = 1.0f;
float max       = 60.0f;

/* GL_POINT_DISTANCE_ATTENUATION takes the three coefficients as a vector */
float coefficients[] = { constant, linear, quadratic };
glPointParameterfv(GL_POINT_DISTANCE_ATTENUATION, coefficients);

/* the remaining parameters are scalars, so the non-vector form fits */
glPointParameterf(GL_POINT_SIZE_MIN, min);
glPointParameterf(GL_POINT_SIZE_MAX, max);
glPointParameterf(GL_POINT_FADE_THRESHOLD_SIZE, thresh);

So, changing the coefficients will make the point sprite a different size, but actually moving the camera doesn't. Anyone have any ideas? I'm on an NVIDIA GeForce 8600, so I'd have to assume that attenuation works.

I suppose part of it is: how does it actually determine the distance between the camera and the vertex? I'd assume it takes the vertex through the modelview transform and measures the eye-space distance from the origin. If that's right, and the modelview matrix is left at identity, I could see that being the problem, since the camera position would never enter into the calculation. This seems rather counter-intuitive, though.

08-30-2008, 09:48 PM
Try a known working demo (http://www.opengl.org/resources/code/samples/mjktips/particles/index.html) on your system.

08-30-2008, 10:31 PM
Yeah, all the particles in that demo are the same size too. I guess that means that distance attenuation doesn't work on my hardware... or I need a better driver, which is unlikely.

Any other ideas? Because, I mean, with 3.0, point sprites are being given more emphasis, and I'd think attenuation is pretty crucial for most of their applications.

I suppose I don't know a lot about the timeline, but I would have figured an NVIDIA GeForce 8600 would be well past when this sort of thing was commonplace.