PDA

View Full Version : counting instructions



divide
11-28-2005, 06:35 AM
Hello

Is there a way to know how many instructions a vertex/fragment shader written in GLSL takes?

thanks

execom_rt
11-28-2005, 07:15 AM
- on Windows / Linux nVidia boards, use the Cg toolkit and the command-line compiler 'cgc.exe': compile your shader with the parameters -entry main -oglsl -D__GLSL_CG_DATA_TYPES -D__GLSL_CG_STDLIB -D__GLSL_SAMPLER_RECT -profile (choose vp40, fp40, arbvp1, etc.). You will get the instruction count for the GeForce, GeForce FX and GeForce 6 series, as well as the assembler output.

- on MacOSX with ATI and nVidia boards, use OpenGL Shader Builder; there is a panel which shows the intermediate output (GLSL -> compiler -> assembler, using Apple's own assembler format). You will get the instruction count as well.

- on Windows with ATI, you need to benchmark; there is no official way to get the intermediate output of a shader (AFAIK).
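For the first option, a typical invocation looks like this (shader.frag is a hypothetical file name, and the exact output format varies by toolkit version; recent versions print the generated assembly with an instruction-count comment):

```shell
# Compile a GLSL fragment shader through the Cg toolkit's command-line
# compiler; -oglsl tells cgc to parse GLSL source rather than Cg, and
# the -D defines expose the Cg types/stdlib to the GLSL front end.
cgc -entry main -oglsl \
    -D__GLSL_CG_DATA_TYPES -D__GLSL_CG_STDLIB -D__GLSL_SAMPLER_RECT \
    -profile fp40 shader.frag
```

Use -profile vp40 (or arbvp1) for vertex shaders instead of fp40.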

divide
11-28-2005, 08:51 AM
thanks :)

kingjosh
11-28-2005, 02:53 PM
Originally posted by execom_rt:
- on Windows / Linux nVidia boards, use the Cg toolkit and the command-line compiler 'cgc.exe': compile your shader with the parameters -entry main -oglsl -D__GLSL_CG_DATA_TYPES -D__GLSL_CG_STDLIB -D__GLSL_SAMPLER_RECT -profile (choose vp40, fp40, arbvp1, etc.). You will get the instruction count for the GeForce, GeForce FX and GeForce 6 series, as well as the assembler output.

Why does one need to define Cg data types and the Cg standard library to compile GLSL code? Are these options (-D__GLSL_CG . . .) documented or published anywhere?

Relic
11-28-2005, 11:08 PM
Searched yourself? This is Google's very first hit:
http://download.nvidia.com/developer/GLSL/GLSL%20Release%20Notes%20for%20Release%2060.pdf

kingjosh
11-29-2005, 12:02 PM
Ok, that didn't help:

In future GLSL-enabled drivers, the preprocessor name __GLSL_CG_DATA_TYPES will be defined if these Cg data types are supported to allow . . .

My real question was:

Why does one need to define Cg data types and the Cg standard library to compile GLSL code?

spasi
11-29-2005, 12:55 PM
Originally posted by kingjosh:
Why does one need to define Cg data types and the Cg standard library to compile GLSL code?

To take advantage of the fixed-point and half data types for better performance on NV hardware. That's my only use of Cg; I haven't had any other need so far.
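A sketch of how that works in practice (assuming the driver defines __GLSL_CG_DATA_TYPES as described in the NVIDIA GLSL release notes; the type alias and uniform/varying names here are made up for illustration):

```glsl
// GLSL 1.10-era fragment shader: use NV's half type where available,
// and fall back to standard full-precision vec4 everywhere else.
#ifdef __GLSL_CG_DATA_TYPES
#define color4 half4   // 16-bit float, faster on NV hardware
#else
#define color4 vec4    // portable full-precision fallback
#endif

uniform sampler2D tex;  // hypothetical texture sampler
varying vec2 uv;        // hypothetical texture coordinate

void main()
{
    color4 c = color4(texture2D(tex, uv));
    gl_FragColor = vec4(c);
}
```

Guarding the types behind the macro keeps the shader compiling unchanged on drivers that don't expose the Cg data types.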