Floating point 3d texture on nvidia 8600 (MacBook)

I am trying to get some volume rendering code to run on my new MacBook Pro with an nvidia 8600 GT.
I want to upload a single-channel 32-bit floating-point 3D texture to the graphics card and use 16-bit float as the internal format.
On my Linux machine with an nvidia 8800 GTX the following call to glTexImage3D works fine:

glTexImage3D(GL_TEXTURE_3D, 0,
             GL_LUMINANCE16F_ARB,
             xRes, yRes, zRes,
             0,                /* border */
             GL_LUMINANCE,
             GL_FLOAT,
             imgData);

…but it does not work on my MacBook Pro. However, if I use:

glTexImage3D(GL_TEXTURE_3D, 0,
             GL_LUMINANCE,
             xRes, yRes, zRes,
             0,                /* border */
             GL_LUMINANCE,
             GL_FLOAT,
             imgData);

…it works, but the texture values are clamped to 8 bits.
GL_LUMINANCE16F_ARB should be the same token as GL_LUMINANCE_FLOAT16_APPLE (0x881E).

Does anyone know what internal format to use in order to use floating point 3D textures on a MacBook Pro with an 8600 GT?
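In case it helps, here is roughly how I check the result of the upload (a sketch; assumes a current GL context and the variables from the call above, with output via printf):

/* Sketch: check the GL error state and which internal format the driver
   actually resolved the texture to (0x881E would be GL_LUMINANCE16F_ARB /
   GL_LUMINANCE_FLOAT16_APPLE). */
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage3D error: 0x%x\n", err);

GLint fmt = 0;
glGetTexLevelParameteriv(GL_TEXTURE_3D, 0, GL_TEXTURE_INTERNAL_FORMAT, &fmt);
printf("internal format chosen by driver: 0x%x\n", fmt);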

What do you mean by “it doesn’t work”? What do you see?

16-bit floating-point textures (the ARB_texture_float extension) should be supported by your graphics card. How old are your drivers?
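You could check whether the driver advertises the extension at all; a quick sketch using the classic GL 2.x extension string (needs <stdio.h> and <string.h>):

/* Sketch: look for ARB_texture_float in the extension string. */
const char *ext = (const char *) glGetString(GL_EXTENSIONS);
if (ext && strstr(ext, "GL_ARB_texture_float"))
    printf("GL_ARB_texture_float is advertised\n");
else
    printf("GL_ARB_texture_float is NOT advertised\n");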

I see more or less random noise. It looks like the values are being interpreted in the wrong bit format.
I have the latest driver supplied by Apple.
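To narrow it down, I also read the data back right after the upload and compare it with what I sent (a rough sketch; assumes the 3D texture from above is still bound and imgData is a float array, plus <stdio.h> and <stdlib.h>):

/* Sketch: read the texture back and compare against the uploaded data. */
float *readback = (float *) malloc(xRes * yRes * zRes * sizeof(float));
glGetTexImage(GL_TEXTURE_3D, 0, GL_LUMINANCE, GL_FLOAT, readback);
printf("uploaded: %f  read back: %f\n", imgData[0], readback[0]);
free(readback);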

File a bug report at http://bugreport.apple.com/
