using glGenTextures with multidimensional array

02-19-2001, 08:47 AM
I'd like to use a multidimensional array to store my textures, but I seem to have some trouble getting glGenTextures to work with it. I have a global variable

GLuint myTextures[8][30];

and down in my texture loading I call

glGenTextures( 8 * 30, myTextures );

however, when I try to compile it I get the error

cannot convert from 'unsigned int [8][30]' to 'unsigned int'

It works fine with one-dimensional arrays, and since the first dimension is only 8, I made 8 separate one-dimensional arrays and that works. It would be a lot easier and cleaner, however, if I could use the multidimensional array. Can anyone give me some ideas on how to do this? Thanks in advance

02-19-2001, 08:54 AM
Try changing this..

glGenTextures( 8 * 30, myTextures );

to this

glGenTextures( 8 * 30, &myTextures[0][0] );

or this could work too..

glGenTextures( 8 * 30, *myTextures );

The error is because myTextures decays to a GLuint (*)[30] (a pointer to an array of 30 GLuints), not to the plain GLuint* that glGenTextures expects. Since all 8 * 30 elements of a 2D array are stored contiguously, a pointer to the first element covers the whole block, which is why both forms above work.