Fragment program newbie (clueless)...

I want to start using vertex and fragment programs. I’ve found a lot of info on vertex programs, but when I look up fragment info on NVIDIA’s developer website, all I get is texture shader and pixel shader demos. Can you only use fragment programs with Cg? Or is there an OpenGL extension I should be looking for (I tried looking for NV_fragment_program)? Basically I want to get away from combiners and start programming the GPU.

Thanks…

John.

Fragment programs are only on DX9-generation cards - i.e. the Radeon 9700 through the GL_ATI_fragment_program extension (anyone seen a spec yet?) and the yet-to-be-released NV30 from NVIDIA with GL_NV_fragment_program (the spec for this was just released - it’s on the NVIDIA site). On any card older than these you are stuck with register combiners or fiddling with tex-env parameters.

The ARB is supposedly working on a GL_ARB_fragment_program spec - and that’s about all the info I’ve heard about it. Perhaps it will be based on GL_ATI_fragment_program…?
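In the meantime, a safe way to pick a code path is to test the extensions string before touching either extension. Here’s a minimal sketch (the `extensions` pointer would come from `glGetString(GL_EXTENSIONS)`; note that a bare `strstr` can false-match a prefix of a longer extension name, so you have to check token boundaries):

```c
#include <string.h>

/* Check whether `name` appears as a complete, space-delimited token
 * in the GL extensions string.  Returns 1 if present, 0 otherwise. */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    if (!extensions || !name || len == 0)
        return 0;

    while ((p = strstr(p, name)) != NULL) {
        /* Match only at the start of a token... */
        if ((p == extensions || p[-1] == ' ') &&
            /* ...and only if the token ends right after the name. */
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
        p += len;
    }
    return 0;
}
```

At startup you would call `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_NV_fragment_program")` and fall back to register combiners or texture shaders if it returns 0.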

Basically I want to get away from combiners and start working on programing the gpu.

When you use NV_register_combiners you ARE programming the GPU. If you want to use NV_fragment_program, then you NEED an NV3x-based card; NV_fragment_program will only run in software on any lesser GeForce - just like NV_texture_shader runs in software on the original GeForce and GeForce 2, but in hardware on the GeForce 3/4 Ti. Sure, you can do fragment programs in Cg, but the NV_fragment_program extension by itself is similar to vertex_program, meaning it’s an assembly language just as vertex_program is. Cg just converts the ‘C’ code to this asm code, which is what the GPU understands. NVIDIA has also released a PDF covering the new extensions for the NV30, and NV_fragment_program is in there.
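For flavor, here is roughly what that assembly looks like - a minimal sketch based on the NV_fragment_program spec (sample a 2D texture and modulate it by the primary color; the register names `f[TEX0]`, `f[COL0]`, and `o[COLR]` are the spec’s interpolant and output registers):

```
!!FP1.0
# R0 = sample from texture unit 0 at interpolated texcoord set 0
TEX R0, f[TEX0], TEX0, 2D;
# output color = texture sample * primary (vertex) color
MUL o[COLR], R0, f[COL0];
END
```

This is the kind of code the Cg compiler emits when you target the NV30 fragment profile.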

-SirKnight

Thanks,

I didn’t realize the extension was so new (which explains why I had difficulty finding info). I was hoping to use it for a GF3/4 driver for my game. Looks like I should stick to texture shaders.

I realize that technically combiners are programming the GPU, but GPU code feels so much cleaner than making hundreds of hard-coded C calls. Plus the GPU code can be compiled on the fly (right?). That should make debugging a breeze: I should be able to edit the code while the engine is running, recompile, then reload the program and instantly see the changes (assuming no parameter changes). Maybe this is a little optimistic (seeing as I don’t know very much about the programs in the first place), but it seems so flexible.
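That edit-and-reload workflow is realistic: the host-side piece is just re-reading the source file and re-uploading the program. Here’s a sketch of the file-reading half, with no GL calls (`load_shader_source` is a name made up for this example):

```c
#include <stdio.h>
#include <stdlib.h>

/* Read an entire shader source file into a malloc'd, NUL-terminated
 * string.  Returns NULL on failure; the caller frees the result.
 * In a reload loop you would call this again whenever the file
 * changes on disk, then hand the new text to the shader API. */
char *load_shader_source(const char *path)
{
    FILE *f = fopen(path, "rb");
    long size;
    char *buf = NULL;

    if (!f)
        return NULL;

    fseek(f, 0, SEEK_END);
    size = ftell(f);
    rewind(f);

    if (size >= 0)
        buf = malloc((size_t)size + 1);

    if (buf) {
        if (fread(buf, 1, (size_t)size, f) != (size_t)size) {
            free(buf);
            buf = NULL;
        } else {
            buf[size] = '\0';   /* shader APIs want a C string */
        }
    }
    fclose(f);
    return buf;
}
```

Bind a key in the engine to call this and re-upload the program, and you get the “edit, recompile, see it instantly” loop described above.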

Btw: if I’m really off the mark on this, feel free to ruin my grand illusions - I would rather find out now than later…

Thanks for the help!

John.

You can use nvparse for your “problems”. It can set up texture shaders at runtime via a texture-shader scripting language, register combiners via an rc scripting language, or both with the DX8 pixel-shader assembly language… at runtime, of course…

davepermen,

I’ll take a look at that. It sounds exactly like what I’m looking for.

Thanks!

John.

Plus the gpu code can be compiled on the fly (right?). That should make debugging a breeze, I should be able to edit the code while the engine is running, recompile, and then reload the program and instantly see the changes (assuming no parameter changes).

That’s one of the nice things about the Cg compiler: you can do stuff like this without having to recompile every time you make a small shader change. I haven’t tried this out myself yet, though.

-SirKnight

You still have to recompile a program if you’ve made a source change. Did I misunderstand what you were talking about, SirKnight?

Of course, the compilation is very fast since the programs are small.

Thanks -
Cass

No, what I meant is that you don’t have to re-compile your WHOLE program - you know, the C/C++ part - just the shader, since the Cg compiler can be set up to compile shader code at app run time. I should have made that more clear; sorry for the confusion.
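For scale, the kind of source that gets recompiled at app run time is tiny - here’s a hypothetical minimal Cg fragment program (names are illustrative; the entry-point name is what you hand to the Cg runtime when compiling):

```
// Minimal Cg fragment program sketch: modulate a texture by the
// interpolated vertex color.  Equivalent to the classic GL_MODULATE
// tex-env, but now expressed as editable source.
float4 main(float4 color : COLOR0,
            float2 uv    : TEXCOORD0,
            uniform sampler2D decal) : COLOR
{
    return tex2D(decal, uv) * color;
}
```

Recompiling something this small takes a fraction of a second, which is why the edit-while-running workflow is practical.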

-SirKnight

[This message has been edited by SirKnight (edited 09-07-2002).]

Heh, now that I re-read my other post I can see why it was confusing. I don’t know why in the hell I worded it like that. That was just horrible.

-SirKnight