We are trying to program with ARB_fragment_program,
and we can't seem to figure out how to tell OpenGL to use a floating-point texture format as the internal format.
If we, for instance, tell it to use RGB16 as the internal format, will it automatically use a floating-point representation if one is available on the card?
And if we want a 32-bit floating-point format, how can we get that?
Jonas