pixel shaders in Geforce 3 ??

Hello!
I’ve got an old GeForce 3 Ti 500 and read today that this old bugger has pixel shader support. Not that I know exactly what a pixel shader is, but since this card has no OpenGL fragment shader support, it has to be something completely different. I know this is an old card that's quite bad compared to the newer ones, but it would still be nice to take it to the edge before upgrading.

Ok, this wasn’t coding so I ask this instead:
Which extensions is NVIDIA referring to when they say it's got pixel shaders? Is it multisample and pbuffers and such?

afaik pixel shaders are referred to as “fragment programs” in OpenGL, so you will need to look for such extensions.
However, it could be that the card has shader support under D3D but not in OpenGL.
At least my GeForce 4 here has the same issue… but I'm not much into shader stuff, so I'm not sure.

The GeForce 3 supports PS 1.1 (GF4 PS 1.3, Radeon 8500 PS 1.4). They're exposed through the GL, but not via a vendor-independent extension.
Under the GL you can access them via the register combiners (GL_NV_register_combiners) and texture shaders (GL_NV_texture_shader). But those aren't nearly as flexible as glSlang or the ARB shaders, which both represent PS/VS 2.0.

Btw, you can also use vertex shaders (ARB_VS and NV_VS) on GF3/4 (and lower), but those will then get executed on the CPU.

The GeForce 3 just isn't very “programmable” at all. It's enough for “pixel shader 1.1”, but that doesn't mean much. In fact, you can get more flexibility out of a GeForce 3 under OpenGL (versus Direct3D), but it's quite a mess to program for.

Originally posted by PanzerSchreck:
Btw, you can also use vertex shaders (ARB_VS and NV_VS) on GF3/4 (and lower), but those will then get executed on the CPU.
ARB_vertex_program is hardware accelerated on GeForce 3 and GeForce 4 Ti. It's software on GeForce 1 and 2, and supposedly something in between on GeForce 4 MX.

Thank you all!
So, the only way to use pixel shaders on an old GF3 is to use NV extensions. Are those extensions implemented in any way by ATI, or does everything made with them only work on NVIDIA graphics cards?
As I said earlier, it would be nice to make this card show its full potential, but it doesn't feel right to do it in an NVIDIA-only environment.
Perhaps it's time to get something new; the GeForce 6800 is due soon… :slight_smile:

Of course ATI cards are capable of doing the same, even the old ones. However, at that time shaders were still very new, so every vendor had its own proprietary extensions (I think ATI_fragment_shader is the equivalent).

However, I can tell you that programming with those old extensions isn't much fun, because they are way too weak and also quite complex.
You might still want to take a look at NV_register_combiners, but once you've understood the basics, you should upgrade your card and switch to glSlang or ARB_fragment_program.

Jan.

You can still get something nice out of that old bugger :) I use dot3 and shadow volumes on my GF2 for an essentially Doom 3-like renderer, and it's nice and optimized (same thing as what JC did, I assume). It's usable in an FPS game. So don't feel bad that you don't have the latest and greatest. It's more satisfying to work with what you have and create good results with it.

The GF3/4 and ATI 8500/9000/9100/9200 use proprietary shaders. The shaders on GF3/4 are less flexible because they're pre-canned programs that the card executes. The ATI shader is more free-form; it's essentially a two-pass mechanism that allows you to do some extra tricks.

The shaders on GF3/4 are called texture shader 1/2/3, I think, and ATI has fragment shader. This would be PS 1.1 on GF3, 1.2/1.3 on GF4, and 1.4 on ATI under Direct3D. All cards that can do PS 2.0 can, I believe, also do the lower-version ones, like NV doing PS 1.4 despite it being ATI's baby. It might run it a bit slower than an ATI card, because PS 2/3 NV cards have to emulate that two-pass architecture.

Basically, you can still get something out of a GF3 and it won't look all that bad. Perfect for learning gfx stuff, and the same maths apply to PS 2/3 as well, the major one being dot3. Actually, I recommend you write a recursive raytracer to get an idea of what these GPUs are trying to do. Vertex shaders are hw accelerated on GF3 and up. You could also use Cg for the GF3 if you prefer easier shader writing than texture shaders, though the results might not be optimal. Depends on your need for speed.

Edit: forgot to say that you get a lot of power by combining NV register combiners with ARB crossbar and texture shaders. A lot more than D3D PS 1.1, so I read. D3D doesn't have crossbar, so you'll burn the first texture stage/unit and thus your program will multipass more. There is lots of good info in the NV texture shader specs. Worth a read imo.