I am trying to get some volume rendering code to run on my new MacBook Pro with an NVIDIA 8600 GT.
I want to upload a single-channel 32-bit floating-point 3D texture to the graphics card and use 16-bit float as the internal format.
On my Linux machine with an NVIDIA 8800 GTX the following call to glTexImage3D works fine:
glTexImage3D(GL_TEXTURE_3D, 0,
             GL_LUMINANCE16F_ARB,
             xRes, yRes, zRes, 0,   // border must be 0
             GL_LUMINANCE,
             GL_FLOAT,
             imgData);
…but it does not work on my MacBook Pro. If I instead use:
glTexImage3D(GL_TEXTURE_3D, 0,
             GL_LUMINANCE,
             xRes, yRes, zRes, 0,   // border must be 0
             GL_LUMINANCE,
             GL_FLOAT,
             imgData);
…it works, but the texture values are clamped to 8 bits.
GL_LUMINANCE16F_ARB should be the same as GL_LUMINANCE_FLOAT16_APPLE (both 0x881E).
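One way to see what the driver actually did with the request (a sketch, assuming a valid GL context with the texture bound to GL_TEXTURE_3D) is to check glGetError() right after the upload and query the internal format the driver allocated for level 0:

```c
/* Sketch: verify what internal format was actually allocated.
 * Assumes a current GL context and the 3D texture bound. */
GLenum err = glGetError();   /* call right after glTexImage3D */
if (err != GL_NO_ERROR)
    fprintf(stderr, "glTexImage3D failed: 0x%X\n", err);

GLint actualFormat = 0;
glGetTexLevelParameteriv(GL_TEXTURE_3D, 0,
                         GL_TEXTURE_INTERNAL_FORMAT, &actualFormat);
printf("internal format: 0x%X\n", actualFormat);
/* 0x881E would be GL_LUMINANCE_FLOAT16_APPLE / GL_LUMINANCE16F_ARB;
 * anything else means the driver substituted a different format. */
```

If glTexImage3D raises GL_INVALID_ENUM or GL_INVALID_VALUE, the driver is rejecting the internal format outright rather than silently falling back.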
Does anyone know what internal format to use in order to use floating point 3D textures on a MacBook Pro with an 8600 GT?