Do you mean I need to use a different format instead of GL_RGB (I tried GL_FLOAT_RGB_NV previously), or that I just can’t pass float data to it, and instead need to pass GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE?
Try not using pixdata; something like:
glBindTexture(GL_TEXTURE_3D, texture3Dname);
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, NULL );
With this you allocate the texture, filled with zeroes or with noise (in fact, when the data pointer is NULL the storage is allocated but its contents are undefined). If there aren't any problems, then your card supports the internal format, and the problem is in pixdata.
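For reference, here is a minimal sketch of that test; it assumes a texture name already generated with glGenTextures, a current GL context with 3D-texture support (GL 1.2+; on Windows glTexImage3D has to be fetched through an extension loader), and it checks glGetError() after the allocation so a format problem shows up immediately:

#include <GL/gl.h>
#include <cstdio>

// Probe whether the driver accepts a GL_RGB float 3D texture.
// Returns true if glTexImage3D raised no error.
bool probeFloatRGB3D(GLuint texture3Dname)
{
    glBindTexture(GL_TEXTURE_3D, texture3Dname);

    // Drain any earlier errors so the check below is meaningful.
    while (glGetError() != GL_NO_ERROR) {}

    // A NULL pointer allocates storage without uploading data,
    // so any error here points at the format, not at pixdata.
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0,
                 GL_RGB, GL_FLOAT, NULL);

    GLenum err = glGetError();
    if (err != GL_NO_ERROR) {
        std::fprintf(stderr, "glTexImage3D failed: 0x%x\n", err);
        return false;
    }
    return true;
}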
OK, I was being an idiot. The code I posted was ripped from the class that creates the texture, and the variable I was passing as intformat_ wasn't initialized. It just took me until today to see it.
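For anyone who hits the same thing, a hypothetical sketch of that kind of bug (the real class and names differ; these are made up):

class VolumeTexture {
    GLint intformat_;  // never assigned -> garbage passed to glTexImage3D
public:
    // Fix: initialize it, e.g. VolumeTexture() : intformat_(GL_RGB) {}
    void upload(const float* pixdata) const {
        glTexImage3D(GL_TEXTURE_3D, 0, intformat_, 32, 32, 32, 0,
                     GL_RGB, GL_FLOAT, pixdata);
    }
};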