
texturing and precision (fp32, fp16...)



airseb
05-25-2004, 01:30 AM
Hi ! :D
I haven't started programming shaders yet, but I wonder two things:
1) How does texturing work (in theory) with shaders (vertex or fragment? or both?) :confused:

2) What do we mean by "precision" (FP32, FP24 or FP16) when talking about a graphics card's capabilities? :confused:
Thanks!

Corrail
05-25-2004, 01:46 AM
1)
Texturing is done via a textureXD function. This function needs a sampler (which represents your texture object) and a texture coordinate. The texture coordinate is written in the vertex shader and then interpolated (just like the color in fixed-function OpenGL) before the fragment shader reads it. There is a small sketch at the end of this post.

2)
When you talk about FP32 precision in a vertex/fragment shader you have a floating point type with 32 bits (standard IEEE 754, just like float in C++). But there are also other floating point types: FP16 and FP24. If you have a 4D FP32 vector it will take 4*32 = 128 bits.
AFAIK nVidia's NV3X and NV4X support FP16 and FP32, and ATI's R3XX and R4XX support FP24.
Of course using a lower-precision data type will gain you some speed, but you might lose quality.
But right now it is not possible to choose the floating point precision in pure GLSL.
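To make point 1) concrete, here is a minimal GLSL sketch of the two halves working together. The sampler name colorMap is just an example; the application has to bind it to a texture unit, and the built-in gl_TexCoord[0] varying is only one way to pass the coordinate along.

// vertex shader: hand the texture coordinate to the rasterizer,
// which interpolates it for every fragment (like a color)
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position    = ftransform();
}

// fragment shader: look up the interpolated coordinate in the texture
uniform sampler2D colorMap;   // example name, set to a texture unit from the application
void main()
{
    gl_FragColor = texture2D(colorMap, gl_TexCoord[0].st);
}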

airseb
05-25-2004, 02:03 AM
Ok, thanks!

jeremyz
05-25-2004, 06:46 AM
GPUs are not IEEE 754 compliant. They are very similar to IEEE 754; for graphics, the subtle differences don't really matter.

On nVIDIA hardware, fp16 can get you a lot of speed-up in the fragment shader; it doesn't really affect performance in the vertex shader.

airseb
05-25-2004, 06:56 AM
nVidia cards support glslang? Somebody told me that only professionals could program with it. Is that true?

Corrail
05-25-2004, 09:04 AM
Originally posted by jeremyz:
GPUs are not IEEE 754 compliant. They are very similar to IEEE 754; for graphics, the subtle differences don't really matter.
Oh, I didn't know that. Thanks for the information!



Originally posted by airseb:
nVidia cards support glslang? Somebody told me that only professionals could program with it. Is that true?
GLSL has been available in nVidia drivers since 56.53, but you have to enable it with NVEmulate or take a look at http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=11;t=000051
Or you can get a 60-series driver at http://www.3dchipset.com/drivers/beta/nvidia/nt5/6111.php . With these drivers you don't need to "activate" GLSL.

Humus
05-26-2004, 07:12 PM
Originally posted by airseb:
nVidia cards support glslang? Somebody told me that only professionals could program with it. Is that true?
ATI has supported glslang on R300 boards for a long time now. NVidia has support nowadays too.

airseb
05-27-2004, 10:28 AM
Ok, thank you guys! :)