Help! glTexImage3D generates INVALID_ENUM: what am I doing wrong?

I’m trying to generate a 32x32x32 3D texture for doing colour lookups, but I can’t seem to get the GL to accept it.

My call looks like this:

glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, pixdata );

And it always generates INVALID_ENUM. What am I doing wrong?
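
To rule out a stale error flag from an earlier call (GL errors stick around until glGetError is read), I’m checking it roughly like this (just a sketch; the printf is my debug output):

/* drain any errors left over from earlier GL calls, then test this one */
while (glGetError() != GL_NO_ERROR)
    ;
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, pixdata );
GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage3D set error 0x%04X\n", err);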

I was originally trying to use GL_FLOAT_RGB_NV, but after looking at the spec I switched to GL_RGB. Can anyone help, please?

Thanks

I’d guess it doesn’t like the float source data sent to a GL_RGB internal format.

Do you mean I need a different internal format instead of GL_RGB (I tried GL_FLOAT_RGB_NV previously), or that I just can’t pass float data to it and need to pass GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE data instead?
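
If it’s the latter, I guess I’d convert the floats first; a rough sketch, assuming my data is all in [0,1]:

GLubyte bytedata[32 * 32 * 32 * 3];
int i;
for (i = 0; i < 32 * 32 * 32 * 3; ++i)
    bytedata[i] = (GLubyte)(pixdata[i] * 255.0f + 0.5f); /* scale [0,1] to [0,255] */
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_UNSIGNED_BYTE, bytedata );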

Cheers,

Anders

How do you define pixdata?

Try not using pixdata; something like:
glBindTexture(GL_TEXTURE_3D, texture3Dname);
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, NULL );

With this you allocate the texture without uploading any data (the contents are undefined until you fill them). If there isn’t any error, then your card supports the internal format, and the problem is in pixdata.
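
Wrapped up as a function it would be something like this (just a sketch; it assumes a valid GL context is current):

/* returns 1 if the driver accepts a 32x32x32 GL_RGB 3D texture, 0 otherwise */
int probe_rgb_3d_texture(void)
{
    GLuint texture3Dname;
    GLenum err;

    glGenTextures(1, &texture3Dname);
    glBindTexture(GL_TEXTURE_3D, texture3Dname);

    while (glGetError() != GL_NO_ERROR) /* clear stale errors */
        ;

    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0,
                 GL_RGB, GL_FLOAT, NULL); /* allocate only, no pixel data */

    err = glGetError();
    glDeleteTextures(1, &texture3Dname);
    return err == GL_NO_ERROR;
}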

API

OK, I was being an idiot. The code I posted was ripped from my class that creates the texture, and the variable I was passing as intformat_ was never initialized, so glTexImage3D was getting a garbage enum instead of GL_RGB. It just took me till today to see it.
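
For anyone who hits the same thing later, a sanity check like this would have caught it straight away (intformat_ is my member variable; the accepted set here is just an example):

/* fail loudly if the internal format was never set to something expected */
assert(intformat_ == GL_RGB || intformat_ == GL_RGBA);
glTexImage3D( GL_TEXTURE_3D, 0, intformat_, 32, 32, 32, 0, GL_RGB, GL_FLOAT, pixdata );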

Thanks for all the help

Anders