View Full Version : Help! glTexImage3D generates INVALID_ENUM: what am I doing wrong?

09-16-2005, 02:26 AM
I'm trying to generate a 32x32x32 3D texture for doing colour lookups, but I can't seem to get the GL to accept it.

My call looks like this:

glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, pixdata );

It always generates GL_INVALID_ENUM. What am I doing wrong?

I was originally trying to use GL_FLOAT_RGB_NV, but after looking at the spec I switched to using GL_RGB. Can anyone help please!?
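[For reference: pixdata for a 32x32x32 GL_RGB/GL_FLOAT texture has to hold 32*32*32*3 tightly packed floats. A minimal sketch of building such a buffer as an identity colour lookup table (the function and variable names are illustrative, not from the original code):]

```c
#include <stdlib.h>

#define LUT_SIZE 32  /* 32x32x32 lookup texture, as in the post */

/* Allocate and fill an identity colour LUT: texel (r,g,b) holds the
 * normalised colour (r,g,b)/(LUT_SIZE-1). GL_RGB/GL_FLOAT means three
 * floats per texel, tightly packed, slice by slice. */
static float *make_identity_lut(void)
{
    float *pixdata = malloc(LUT_SIZE * LUT_SIZE * LUT_SIZE * 3 * sizeof(float));
    if (!pixdata)
        return NULL;

    float *p = pixdata;
    for (int b = 0; b < LUT_SIZE; ++b)           /* slice: blue   */
        for (int g = 0; g < LUT_SIZE; ++g)       /* row:   green  */
            for (int r = 0; r < LUT_SIZE; ++r) { /* texel: red    */
                *p++ = r / (float)(LUT_SIZE - 1);
                *p++ = g / (float)(LUT_SIZE - 1);
                *p++ = b / (float)(LUT_SIZE - 1);
            }
    return pixdata;
}
```

[A buffer like this would be passed as the last argument of the glTexImage3D call above; since each texel's colour equals its own coordinates, sampling it returns the input colour unchanged, which is a handy starting point for lookup-table effects.]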


09-16-2005, 10:18 PM
I'd guess it doesn't like the float source data sent to a GL_RGB internal format.

09-17-2005, 03:01 AM
Do you mean I need to use a different format instead of GL_RGB (I tried GL_FLOAT_RGB_NV previously), or that I just can't pass float data to it, and instead need to pass GL_UNSIGNED_SHORT or GL_UNSIGNED_BYTE?



09-17-2005, 12:17 PM
How do you define pixdata?

Try not passing pixdata, something like:
glTexImage3D( GL_TEXTURE_3D, 0, GL_RGB, 32, 32, 32, 0, GL_RGB, GL_FLOAT, NULL );

With this you allocate the texture, filled with zeroes or with noise (I'm not sure which). If there aren't any problems, then your card supports the internal format, and the problem is in pixdata.
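[When narrowing this kind of thing down, it also helps to call glGetError after each suspect GL call and print the result by name. A small hedged helper along these lines (the #defines mirror the numeric values the GL spec assigns, so the sketch compiles standalone; in real code you would get them from gl.h instead):]

```c
/* Map a glGetError() code to a readable name. The #defines below are a
 * standalone stand-in for the gl.h definitions; include the real GL
 * headers in actual code. */
#ifndef GL_NO_ERROR
#define GL_NO_ERROR          0x0000
#define GL_INVALID_ENUM      0x0500
#define GL_INVALID_VALUE     0x0501
#define GL_INVALID_OPERATION 0x0502
#endif

static const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case GL_NO_ERROR:          return "GL_NO_ERROR";
    case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
    case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
    case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
    default:                   return "unknown GL error";
    }
}
```

[Calling something like printf("%s\n", gl_error_name(glGetError())); right after glTexImage3D shows which class of error the driver raised: GL_INVALID_ENUM points at one of the enum arguments (target, internal format, format, type), while GL_INVALID_VALUE points at a dimension, border, or level argument.]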


09-19-2005, 12:43 AM
Ok, I was being an idiot. The code I posted was ripped from the class that creates the texture, and the variable I was passing as intformat_ was never initialized. It just took me until today to see it.

Thanks for all the help