Texturing and precision (fp32, fp16...)

Hi! :smiley:
I haven’t started programming shaders yet, but I’m wondering about two things:

1) How does texturing with shaders work, in theory? Is it done in the vertex shader, the fragment shader, or both? :confused:

2) What do we mean by “precision” (FP32, FP24 or FP16) when talking about the capabilities of graphics cards? :confused:

Thanks!

Texturing is done via a textureXD function (texture1D, texture2D, texture3D, …). This function needs a sampler (which represents your texture object) and a texture coordinate. The texture coordinate is written by the vertex shader and then interpolated across the primitive (just like the color in fixed-function OpenGL), so the lookup itself normally happens in the fragment shader.
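For example, something like this (a rough, untested sketch; the names texCoord and tex are just mine, everything else is standard GLSL):

```glsl
// vertex shader: pass the texture coordinate on to be interpolated
varying vec2 texCoord;

void main()
{
    texCoord    = gl_MultiTexCoord0.xy; // comes from glTexCoord / the texcoord array
    gl_Position = ftransform();         // standard fixed-function transform
}
```

```glsl
// fragment shader: sample the texture with the interpolated coordinate
varying vec2 texCoord;
uniform sampler2D tex; // set to the texture unit number with glUniform1i

void main()
{
    gl_FragColor = texture2D(tex, texCoord);
}
```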

When you talk about FP32 precision in a vertex/fragment shader, you have a 32-bit floating-point type (standard IEEE 754, just like float in C++). But there are also other floating-point types: FP16 and FP24. A 4D FP32 vector therefore takes 4 * 32 = 128 bits.
AFAIK nVidia’s NV3X and NV4X support FP16 and FP32, while ATI’s R3XX and R4XX support FP24.
Of course, using a lower-precision data type will give you some speed, but you might lose quality.
But right now it is not possible to choose the floating-point precision in pure GLSL.
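For reference, the bit layouts are (as far as I know): FP32 = 1 sign + 8 exponent + 23 mantissa bits, FP16 = 1 + 5 + 10, and ATI’s FP24 = 1 + 7 + 16, so the quality loss with the smaller types comes mostly from the shorter mantissa.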

OK, thanks!

GPUs are not IEEE 754 compliant. They are very similar to IEEE 754; for graphics, the subtle differences don’t really matter.

On nVIDIA hardware, fp16 can get you a lot of speed-up in the fragment shader; it doesn’t really affect performance in the vertex shader.

Do nvidia cards support glslang? Somebody told me that only professionals could program with it, is that true?

GPUs are not IEEE 754 compliant. They are very similar to IEEE 754; for graphics, the subtle differences don’t really matter.

Oh, didn’t know that. Thanks for the information!

Do nvidia cards support glslang? Somebody told me that only professionals could program with it, is that true?

GLSL has been available in nVidia drivers since 56.53, but you have to use NVEmulate, or take a look at http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=11;t=000051
Or you can get a 60-series driver at http://www.3dchipset.com/drivers/beta/nvidia/nt5/6111.php . With those drivers you don’t need to “activate” GLSL.

Originally posted by airseb:
Do nvidia cards support glslang? Somebody told me that only professionals could program with it, is that true?
ATI has supported glslang on R300 boards for a long time now. NVidia has support nowadays too.

OK, thank you guys! :slight_smile:
