NV_POINT_SPRITE

Hi guys, I want to try out the new point sprite extension, but after reading the spec a few times I’m not sure how to implement it, and there doesn’t seem to be a tutorial/demo for it yet. Does anyone know how to use it? Presumably I need to bind a texture, enable NV_point_sprite, then render the points? With GL_TEXTURE_2D enabled, drawing points is slow and draws nothing. Looking at the spec I think I need to use COORD_REPLACE_NV somewhere, but I’m not sure where or why.

Will the point sprite size vary with Z or will it behave like GL_POINTS?

Thanks

For each texture unit on which you want point sprite texture coordinate generation, you need to call:

glTexEnvi(GL_POINT_SPRITE_NV, GL_COORD_REPLACE_NV, GL_TRUE);

You can set this on any or all texture units.

On the other hand, the point sprite R coordinate mode (zero, S, or R) is global across all texture units.
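
Putting that together, a minimal setup might look like the following. This is only a sketch, not code from this thread: it assumes a current GL context, a driver exposing NV_point_sprite and ARB_multitexture, and a 2D texture already created and bound.

```c
/* Per-unit state: request texcoord replacement on the unit(s) you use. */
glActiveTextureARB(GL_TEXTURE0_ARB);
glTexEnvi(GL_POINT_SPRITE_NV, GL_COORD_REPLACE_NV, GL_TRUE);

/* Global state: the R coordinate mode applies across all units at once. */
glPointParameteriNV(GL_POINT_SPRITE_R_MODE_NV, GL_ZERO);

glEnable(GL_POINT_SPRITE_NV);
glEnable(GL_TEXTURE_2D);

glBegin(GL_POINTS);
/* one glVertex* per sprite; no glTexCoord needed, it gets replaced */
glEnd();
```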

  • Matt

Ok, thanks, but I was looking for something more along the lines of this (it doesn’t work, but it’s the way I’m currently doing it):

#ifndef GL_NV_point_sprite
#define POINT_SPRITE_NV 0x8861
#define COORD_REPLACE_NV 0x8862
#define POINT_SPRITE_R_MODE_NV 0x8863
#endif

glEnable(GL_TEXTURE_2D);
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
glGenTextures(1,&Texture);
glBindTexture(GL_TEXTURE_2D, Texture);

glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);

glTexEnvi(POINT_SPRITE_NV, COORD_REPLACE_NV, GL_TRUE);


Load Texture....

Main Loop

glEnable(POINT_SPRITE_NV);
glEnable(GL_TEXTURE_2D); (should this be enabled?)
glBindTexture(GL_TEXTURE_2D, Texture); (I must need this surely)
glBegin(GL_POINTS);

Loop through points and render using a single glVertex and glColor. I don’t need glTexCoord2f, do I?

glEnd();

With GL_TEXTURE_2D enabled it’s very slow and draws nothing; with it disabled I just get the standard points.

I have an original GeForce 3 and Detonator 27.7 drivers; it should work on this hardware/driver config, right?

Originally posted by Adrian:
I have an original GeForce 3 and Detonator 27.7 drivers; it should work on this hardware/driver config, right?

Not according to the compatibility matrix at the beginning of the document, it won’t. It’s listed as NV25 only.

Damn, you’re right, I only checked the txt document. I’d kind of assumed that since I can see point sprites in 3DMark I would get them in OpenGL too. Thanks for pointing that out.

Doesn’t the compatibility matrix in nvopenglspecs.pdf refer only to the driver version? E.g., GL_EXT_multi_draw_arrays is listed for R25 on NV1x, and I have it in my extension string (GF2). Just check your extension string to be sure…

Michael

I’ve checked my extension string and it’s not there. Interestingly, NV_point_sprite is the only extension that’s available solely on the GeForce 4. I would have thought it could be emulated on older hardware…

point_sprite and texture_shader3, I guess…

Too bad, I’ve been waiting for point sprites in GL for over a year now for my GF2 MX…

According to the PDF, texture_shader3 is emulated on NV1x, but you’ll need the latest drivers (R25) for it.

NVidia has GL_NV_point_sprite, ATi has GL_ATIX_point_sprite. What about making an EXT?

Just a thought …

I want the ****ing point sprites emulated, not texture_shader3… what do I want with 0.1 fps??? I think point sprites can run quite well on any hardware… like the DX8 point sprites do in MadOnion’s 3DMark2001…

NV_point_sprite is significantly more general than what is provided in DX; for example, it allows you to replace the coordinate for any or all texture units, and it provides the R coordinate mode control, which is very useful for animated point sprites.

It’s worth noting that the DX documentation on point sprites is pretty horrible – it doesn’t say which texture coordinate is to be replaced.

The simple fact of the matter is that we didn’t want to expose this functionality in anything less than a clean manner. We believe that NV_point_sprite is really the right way to expose it.

It is conceivable that we might accelerate a subset of NV_point_sprite on GF3. There would be restrictions on the modes accelerated (and at least one would be non-obvious in nature). It’s not the sort of thing we would want to build an extension off of.

It’s also possible that we could provide reasonable emulation support for point sprites, but it would of course involve SW T&L.

  • Matt

And? I have the whole vertex program path running in software here… so where’s the problem in implementing point sprites in software, too? It’s just so boring to write your own point-sprite engine when you know the extension IS there… and it CAN be done reasonably fast in emulation (compared to emulated texture3D or texture shaders it will be FAST).

What do I want with texture_shader3 on my GF2 MX? I can’t use it… but the point sprites would be very useful.

And yes, I think the spec is well done, and the implementation is cleaner than the DX8 one…

DX8 made it simpler: it’s NOT GL_POINTS anymore, it’s simply GL_POINT_SPRITES…

glBegin(GL_POINT_SPRITES); … glEnd();

and for every glVertex4f there will be a quad of size w at position x,y,z on the screen… with the texture fit onto it completely…

We don’t really need more, but the version you invented for GL IS more… enough for everyone, for sure… now just implement it in the driver for everyone…

Oh… and get together with ATI on GL_EXT_point_sprite… pleaaaaaase
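
The expansion described above, where each point becomes a screen-aligned quad with the texture mapped across it, can be sketched in plain C. This is purely an illustration of the rasterization semantics, not driver code; `Corner` and `expand_point_sprite` are made-up names, and the t direction follows NV hardware, which inverts it.

```c
/* Illustration only: expand a point sprite of size `size` centred at
 * window position (x, y) into the four corners of a screen-aligned
 * quad, with replaced texture coordinates running 0..1 across it.
 * Corner order: lower-left, lower-right, upper-right, upper-left. */
typedef struct { float x, y, s, t; } Corner;

static void expand_point_sprite(float x, float y, float size, Corner out[4])
{
    float h = size * 0.5f;
    /* t increases downward on NV hardware (inverted), so the top edge
     * of the quad gets t = 0 and the bottom edge gets t = 1. */
    out[0] = (Corner){ x - h, y - h, 0.0f, 1.0f }; /* lower-left  */
    out[1] = (Corner){ x + h, y - h, 1.0f, 1.0f }; /* lower-right */
    out[2] = (Corner){ x + h, y + h, 1.0f, 0.0f }; /* upper-right */
    out[3] = (Corner){ x - h, y + h, 0.0f, 0.0f }; /* upper-left  */
}
```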

It would be very bad for us to make an extension that created a new primitive type GL_POINT_SPRITES. Extensions are not supposed to add new primitive types, because the primitive types are intentionally packed starting at zero. If we added a new type, and someone else added a new type, the extensions would conflict with one another. It would be a disaster.

Emulation for this extension may arrive in the future, but it’s not a high priority.

  • Matt

Just out of curiosity, why is t inverted for point sprites?

Because that’s how the hardware works. All other reasons are irrelevant in the face of that…

Not my favorite choice, but I didn’t have any say in the matter.

  • Matt
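
Since t comes out inverted, one common workaround (a sketch, not something from this thread) is to flip the texture image vertically at load time, before glTexImage2D. This assumes a tightly packed RGBA8 image, and `flip_image_rows` is a made-up helper name.

```c
/* Sketch: flip a tightly packed RGBA8 image in place about its
 * horizontal axis, so a texture authored top-row-first displays
 * upright with the point sprites' inverted t.
 * width/height are in pixels, 4 bytes per pixel. */
static void flip_image_rows(unsigned char *pixels, int width, int height)
{
    int rowBytes = width * 4;
    for (int top = 0, bot = height - 1; top < bot; ++top, --bot) {
        unsigned char *a = pixels + top * rowBytes;
        unsigned char *b = pixels + bot * rowBytes;
        for (int i = 0; i < rowBytes; ++i) {
            unsigned char tmp = a[i];
            a[i] = b[i];
            b[i] = tmp;
        }
    }
}
```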

Hm, OK… so that’s why new primitives never arrived…
Well, OK… I don’t think your implementation is bad, I never said that… I just noted that for DX it’s a new primitive, which made it easier to define how it has to behave…

I only hope you can get this sorted out with ATI sometime, that would be very nice…

I’m attempting to implement point sprites as well; did this discussion ever produce a working chunk of code? Here’s what I have right now:

glActiveTextureARB (GL_TEXTURE3_ARB); /* note: glClientActiveTextureARB only selects the unit for vertex arrays; glTexEnv and glEnable need the server-side selector */
glTexEnvi (POINT_SPRITE_NV, COORD_REPLACE_NV, GL_TRUE);
glEnable (POINT_SPRITE_NV);
testTexture.mBind (GL_TEXTURE3_ARB);
glEnable (GL_TEXTURE_2D);
glPointSize (10);
glBegin (GL_POINTS);
...
glEnd ();

glDisable (POINT_SPRITE_NV);
glActiveTextureARB (GL_TEXTURE3_ARB);
glDisable (GL_TEXTURE_2D);

From the docs it sounded like I ought to use the texture units in descending order, hence the use of GL_TEXTURE3_ARB.

And I’m not getting anything terribly useful (just a lot of big points). I’m running on a GF3 Ti 200 (28.32); it sounded like the GF3s would be able to handle point sprites with later drivers.

Can I assume that point sprites set their size after transforming with the projection matrix (i.e., can I have them get bigger as I get near them)? Or should I go back to my old vertex program for that?

I’ve got point sprites working on my GF4. I don’t have the GF3 anymore to test on, but I also thought point sprites are now supported on the GF3. You could download my Gravity program from www.polygonworlds.com and see if you get point sprites on your GF3.

This is my point sprite code.

glPointParameterfvEXT(GL_DISTANCE_ATTENUATION_EXT, quadratic);

glEnable(POINT_SPRITE_NV);
glTexEnvf(POINT_SPRITE_NV, COORD_REPLACE_NV, GL_TRUE);

glPointSize(PointSize);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, Texture);

glBegin(GL_POINTS);
....
glEnd();

You’ll want to use the distance attenuation extension so that the sprites get bigger as they get closer.

Point sprites aren’t supported on a GF3…

I know, it sucks.