Calculating gl_PointCoord Texture Coords

Hey guys. This is my first post here. I’m a Unity dev. Unity supports point sprites through its MeshTopology.Points enum. The only problem is that Unity doesn’t provide anything for each point’s texture coordinates; it doesn’t support GL_COORD_REPLACE or anything of that nature. I’ve seen this resource https://www.opengl.org/registry/specs/ARB/point_sprite.txt which shows how point sprite texture coordinates are calculated. I’ve tried this, but it doesn’t work. I need working code that calculates s/t coordinates for point sprites in GLSL or Cg, or even pseudo-code. Thank you very much.

I don’t know anything about how Unity deals with this; I don’t know if Unity lets you use GLSL directly, and if it does, I don’t know what version it restricts you to.

But as far as (modern) OpenGL is concerned, point sprites are just point primitives. Their size is determined by the last vertex processing shader stage (usually a vertex shader), by writing to [var]gl_PointSize[/var]. And for fragment shaders, the location within the point is provided via [var]gl_PointCoord[/var]. While you can use it as a “texture coordinate”, it’s just a value that tells where you are within the point. You could use it for things completely unrelated to textures.

These values should be available to you in OpenGL 3.0 and above. They may be in older versions too, but they’re at least in 3.0 (and therefore GLSL version 1.30).
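
For illustration, a minimal GLSL 1.30-style shader pair along those lines might look like the following sketch. The uniform and attribute names here are just placeholders I made up, not anything Unity- or engine-specific:

[code]
// Vertex shader -- writes the point's size in pixels.
#version 130
uniform mat4 u_mvp;        // assumed model-view-projection matrix
uniform float u_pointSize; // desired point size in pixels
in vec3 a_position;

void main()
{
    gl_Position = u_mvp * vec4(a_position, 1.0);
    gl_PointSize = u_pointSize;
}

// Fragment shader -- uses gl_PointCoord as a [0, 1] texture coordinate.
#version 130
uniform sampler2D u_sprite;
out vec4 fragColor;

void main()
{
    fragColor = texture(u_sprite, gl_PointCoord);
}
[/code]

(In a raw OpenGL core-profile program you’d also need to enable GL_PROGRAM_POINT_SIZE for the gl_PointSize write to take effect.)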

Thanks for the reply, Alfonse. I’m familiar with all of that. And Unity does support GLSL, but I’m more interested in a method that calculates the fragment’s coordinates than in the code itself; I can always port code to Cg. I need pseudo-code or a method that tells me how gl_PointCoord is calculated (what is done in the vertex shader, what is done in the fragment shader, and so on). The s and t formula provided in the link I posted in the first post doesn’t work. I’m a bit confused as to how to get “the unrounded window coordinates of the vertex” in the fragment shader. This is what I need help with: a methodical layout of what is done. Thank you.

You have already seen the “code itself” that computes gl_PointCoord; you gave a link right to it. You say that it “doesn’t work”, but that’s exactly how it’s generated (conceptually speaking; in all likelihood, no actual shader code computes it). So it’s not that the algorithm is wrong; it’s that your implementation of it is wrong.

Computing gl_PointCoord manually is a pointless exercise. OpenGL already provides it, and your computation of it will almost certainly not be faster than just accessing gl_PointCoord. And gl_PointCoord can easily be transformed into whatever range you need, since it’s a normalized [0, 1] range.
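
For example, one common trick (just a sketch) is to remap it from [0, 1] to [-1, 1] and discard fragments outside the unit circle, which gives you round points:

[code]
// Remap gl_PointCoord from [0, 1] to [-1, 1] and reject the corners,
// giving a circular point sprite.
vec2 p = gl_PointCoord * 2.0 - 1.0;
if (dot(p, p) > 1.0)
    discard;
[/code]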

However, if you insist.

As you’ve shown, by reference to the ARB_point_sprite extension, the algorithm for computing gl_PointCoord requires three things: the current fragment’s window-space position, the window-space position of the vertex that generated the point, and the point’s size.

The first is provided to the fragment shader: gl_FragCoord. The second and third are not provided. Therefore, you must provide them yourself.

Your vertex shader already has the point’s size; you wrote it to gl_PointSize. So you simply make an output variable and write that same size to that variable, which your fragment shader takes as input.

The vertex position is a bit more complicated. In order to compute that, your vertex shader needs to take the value it wrote to gl_Position and perform these vertex post-processing steps on it. So it has to divide by W, and then apply the viewport parameters. You only need the XY components, so you only need to compute and output a vec2.
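A sketch of that vertex-shader side might look like this. The uniform names and the (x, y, width, height) packing of the viewport are assumptions of mine; arrange them however you like:

[code]
// Vertex shader (sketch) -- also emits the point's window-space centre and size.
#version 130
uniform mat4 u_mvp;
uniform float u_pointSize;
// Viewport as (x, y, width, height), mirroring glViewport -- see below.
uniform vec4 u_viewport;

in vec3 a_position;

out vec2 v_windowPos;  // unrounded window-space XY of this vertex
out float v_pointSize; // size in pixels, same value written to gl_PointSize

void main()
{
    gl_Position = u_mvp * vec4(a_position, 1.0);
    gl_PointSize = u_pointSize;
    v_pointSize = u_pointSize;

    // Vertex post-processing by hand: perspective divide, then viewport transform.
    vec2 ndc = gl_Position.xy / gl_Position.w;                        // [-1, 1]
    v_windowPos = u_viewport.xy + (ndc * 0.5 + 0.5) * u_viewport.zw; // window space
}
[/code]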

But this means that your vertex shader needs access to the viewport parameters. So you need to store those in uniforms and apply them to your program before rendering.
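
And the matching fragment-shader side, following the s/t formula from the ARB_point_sprite text (again just a sketch; the sign flip on the Y term accounts for gl_PointCoord’s default upper-left origin):

[code]
// Fragment shader (sketch) -- reconstructs a gl_PointCoord equivalent.
#version 130
uniform sampler2D u_sprite;

in vec2 v_windowPos;   // window-space centre of the point
in float v_pointSize;  // point size in pixels

out vec4 fragColor;

void main()
{
    // gl_FragCoord.xy already sits at the fragment centre (x_f + 1/2, y_f + 1/2),
    // so the +1/2 terms from the spec's formula are folded in.
    vec2 pointCoord;
    pointCoord.x = 0.5 + (gl_FragCoord.x - v_windowPos.x) / v_pointSize;
    pointCoord.y = 0.5 - (gl_FragCoord.y - v_windowPos.y) / v_pointSize;

    fragColor = texture(u_sprite, pointCoord);
}
[/code]

On the application side, u_viewport would just be filled in from whatever was passed to glViewport before the draw call (in Unity, presumably the camera’s pixel rect).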

Or you could just use gl_PointCoord…

Unity doesn’t support gl_PointCoord; this is why I have to calculate it myself. As I said earlier, no OpenGL functions or built-in variables. I have calculated the window coordinates of the fragment, and I have the point size readily available. What I don’t have is the vertex position. When the vertex position is passed on to the fragment, it is interpolated, and Unity doesn’t support the ‘flat’ keyword. I have the vertex’s position in window coordinates; now I need to know how to send it untouched to the fragment shader. So how exactly do I go about storing those in uniforms and applying them before rendering? Thanks.

OK, what you have just described contradicts the OpenGL specification and the concept of a point primitive. However, that contradiction itself is quite illuminating.

First, allow me to explain the impossible part.

Point primitives are defined by a single vertex. All fragments generated by that point necessarily have only that one vertex as source data, so there is nothing to interpolate; ‘flat’ is no different from ‘smooth’ for a point primitive. All fragments from a point get the same values.

The only exceptions are the built-ins gl_PointCoord and gl_FragCoord.

Given that, it is impossible for you to have “calculated the window coordinates of the fragment”. There is no way to generate the equivalent of gl_FragCoord in a fragment shader without using one of those built-ins.

What you describe is impossible: you cannot have OpenGL point primitives (via GL_POINTS) and interpolation of per-vertex values. Yet you claim to have computed the fragment’s window coordinates yourself, which is impossible without interpolation or the built-ins you say you can’t use. If that claim is true, there is only one possible conclusion:

You’re not rendering OpenGL point primitives.

Or, to put it another way, Unity is cheating behind your back. My guess is that Unity is turning your “points” into quads; that’s a common trick (it lets you have arbitrarily sized points, and gets around the stupidity of OpenGL point clipping). It would also explain why they would forbid you from using gl_PointCoord. That value wouldn’t be accurate, since point primitives are not actually being rendered. And since it’s a GLSL built-in, they can’t simply overwrite it.

Given that… we can’t help you. The problem you’re encountering is a Unity problem, not an OpenGL problem. So you’ll need to take that up with the Unity folks if you want to compute an equivalent to gl_PointCoord. You know the algorithm, but without knowing exactly what Unity’s actually telling OpenGL to do, I can’t help you.

You need a Unity forum.