View Full Version : General GPU Compute problem

11-13-2007, 11:05 PM
Hi, I'm doing tensor operations on the GPU using textures. The problem is that texture lookups return values in the range [0,1] in RGBA mode. If I load arbitrary data into a texture, will the sampled values come back non-normalized? If not, how can I make the texture lookup return the values without normalizing them?

11-14-2007, 01:32 AM
Use floating-point textures

11-14-2007, 08:31 AM
Thanks, Zengar! You mean using the internal format RGBA32F_ARB?

11-14-2007, 09:01 AM
For example. Normal RGBA textures have 8 bits per channel. That is actually a fixed-point representation, with 255 mapping to 1.0 and 0 to 0.0. That is why you can't pack a value greater than one into such a texture.

11-14-2007, 01:55 PM
You may want to consider how much range and precision you actually need. Just because you need range outside [0..1] doesn't necessarily mean RGBA8 won't do. You can pack and unpack to this range simply with a scale and bias. For instance, it's common to use [-1..1] by doing texValue * 2.0 - 1.0 after the lookup. Depending on the distribution of your data, it may even be better to use fixed point than floating point: fixed point has uniform precision across its range, which is an advantage if your data is generally linear in nature.

11-14-2007, 07:48 PM
Thanks!!! I know how to do this now: use the extensions ARB_texture_float and ARB_color_buffer_float.
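For reference, a minimal sketch of allocating an unnormalized 32-bit float texture with ARB_texture_float (this assumes a current GL context with the extension available; `width`, `height`, and `data` are placeholders):

```c
/* Sketch: allocate a GL_RGBA32F_ARB texture. Lookups return the
 * stored floats as-is, with no [0,1] normalization or clamping.
 * NEAREST filtering is used because older hardware may not
 * support linear filtering of float textures. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F_ARB,
             width, height, 0, GL_RGBA, GL_FLOAT, data);
```

ARB_color_buffer_float is the companion piece when you also want to render unclamped float values into a framebuffer, rather than just sample them.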