3D texture and GL_LUMINANCE8

Hello

With the code:

glTexImage3D(GL_TEXTURE_2D_ARRAY_EXT, 0, GL_LUMINANCE8, 512, 512, 2, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, tex2Darray);

(tex2Darray points to a GLubyte array of the right size, 512 × 512 × 2 bytes)

glGetError() gives the value 0x500 “GL_INVALID_ENUM”

Is there a problem with 2D texture arrays and GL_LUMINANCE8?

Should I use glTexStorage3D() and glTexSubImage3D() instead? If so, what should the “internalformat” of the first call and the “format” and “type” of the second be to accept a GL_LUMINANCE8-style texture?

Thanks
Cathy L.

Oops, this isn’t the right place for this post, sorry about that. Could anybody move it? Thanks and sorry again :-[

Which version?

[ul]
[li] OpenGL 2 doesn’t have GL_TEXTURE_2D_ARRAY_EXT (or GL_TEXTURE_2D_ARRAY).
[/li][li] GL_TEXTURE_2D_ARRAY_EXT is only available if the EXT_texture_array extension is available.
[/li][li] OpenGL 3 has GL_TEXTURE_2D_ARRAY but the core profile doesn’t have GL_LUMINANCE or GL_LUMINANCE8.
[/li][/ul]
EXT_texture_array was written against OpenGL 2.0, so if that extension is present, I believe your call should work as-is. In an OpenGL 3 compatibility profile, it should work with GL_TEXTURE_2D_ARRAY.
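As a sketch of that check (this assumes GLEW as the loader; any loader with version/extension queries works the same way):

/* Decide at run time whether 2D texture arrays are usable.
 * GLEW is assumed here purely for the version/extension queries. */
#include <GL/glew.h>

if (GLEW_VERSION_3_0 || GLEW_EXT_texture_array)
{
    /* GL_TEXTURE_2D_ARRAY and GL_TEXTURE_2D_ARRAY_EXT have the same value,
     * so either token can be passed to glTexImage3D(). */
    GLenum target = GL_TEXTURE_2D_ARRAY;
    /* ... glTexImage3D(target, 0, GL_LUMINANCE8, ...) as in the original post,
     * but only on a compatibility (non-core) context. */
}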

[QUOTE=GClements;1255131]Which version?
[ul]
[li] OpenGL 3 has GL_TEXTURE_2D_ARRAY but the core profile doesn’t have GL_LUMINANCE or GL_LUMINANCE8.
[/li][/ul]
[/QUOTE]

Hello
Thanks for your answer.
I’m porting old OpenGL 1 code to 3.3 and I didn’t know that GL_LUMINANCE/GL_LUMINANCE8 had been removed from the core profile; thanks to you, I discovered that. Googling around, I found that the way to load a one-byte-per-texel monochrome texture is to use GL_RED (with the GL_R8 sized internal format).
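
For reference, a minimal sketch of that upload on a 3.3 core profile (just an illustration, not the exact code from this thread):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D_ARRAY, tex);

/* One byte per texel, so make sure row alignment can't add padding. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

/* GL_R8 is the sized internal format that replaces GL_LUMINANCE8;
 * the client data is described as GL_RED / GL_UNSIGNED_BYTE. */
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_R8, 512, 512, 2, 0,
             GL_RED, GL_UNSIGNED_BYTE, tex2Darray);

/* Level 0 only, so use a non-mipmap minification filter. */
glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

/* With GL 4.2 or ARB_texture_storage, the immutable-storage equivalent is:
 *   glTexStorage3D(GL_TEXTURE_2D_ARRAY, 1, GL_R8, 512, 512, 2);
 *   glTexSubImage3D(GL_TEXTURE_2D_ARRAY, 0, 0, 0, 0, 512, 512, 2,
 *                   GL_RED, GL_UNSIGNED_BYTE, tex2Darray);
 */

/* Optional: swizzle so sampling returns (r, r, r, 1) like the old
 * luminance formats did, instead of (r, 0, 0, 1). */
GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_SWIZZLE_RGBA, swizzle);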

Thanks again
Cathy L.