I’m on Mac OS X 10.6.3 with a GeForce GTX 260 (though the problem seems to be the same on other Macs too): I’m using textures with the format GL_LUMINANCE_ALPHA16F_ARB and they don’t seem to work (they don’t retain anything, or show noise). If I change the format to GL_RGBA16F_ARB, it works.
Is there a workaround, a trick, or an Apple-specific LA16F format?
It wasn’t clear from your first post that you were talking about rendering, not sampling.
Rendering to L/LA/A/I formats is not supported by any Mac renderer. These formats weren’t allowed by EXT_fbo. They are allowed by ARB_fbo, but not mandated. They’re deprecated in GL3.
R/RG formats are allowed by ARB_fbo, and mandated by GL3. They are renderable on all Mac renderers that export ARB_texture_rg.
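Since renderability varies by driver and format like this, the safest approach is to probe it at runtime rather than assume. Below is a minimal sketch (the helper name `is_renderable` is mine, and it assumes a current GL context with ARB_framebuffer_object and the usual GL headers/loader already set up):

```c
/* Probe whether `internalformat` is color-renderable on this driver by
 * attaching a small texture to an FBO and checking completeness.
 * Assumes a current GL context with ARB_framebuffer_object. */
static int is_renderable(GLenum internalformat)
{
    GLuint tex, fbo;
    GLenum status;

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, internalformat, 4, 4, 0,
                 GL_RGBA, GL_FLOAT, NULL);

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    status = glCheckFramebufferStatus(GL_FRAMEBUFFER);

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
    glDeleteTextures(1, &tex);

    return status == GL_FRAMEBUFFER_COMPLETE;
}
```

On a renderer exporting ARB_texture_rg, `is_renderable(GL_RG16F)` should report complete, while LA formats would typically come back GL_FRAMEBUFFER_UNSUPPORTED, per the above.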
Actually, LA16 (unsigned int) is not supported in hardware by earlier GPU generations; integer textures are only supported from SM4.0-level hardware (i.e. G80) onwards.
OpenGL has advertised some texture formats since GL1.0 that are not hardware accelerated, and LA16 is one of those.
Oh, OK. So I’ll stick with RG16F. This format has been supported since NV40, so that’s fine for me, but do you know about ATI? I’m a bit worried, since I don’t see any reference to RG16F in Appendix G of this document: ATI OpenGL Programming and Optimization Guide
As far as I know, NVIDIA supports RG16F only since G80 (this is also mentioned in the texture format support list you linked).
Actually, ATI is the one that supports GL_ARB_texture_rg from an earlier generation, namely since the ATI X1xxx series.
Indeed NVIDIA has supported RG16F only since G80, but looking at the table they also have support for FLOAT_RG16 since NV40 (NV_float_buffer), which looks the same to me?