imageStore when internal format is GL_LUMINANCE16
I have a 3D texture:
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, wrapX);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, wrapY);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, wrapZ);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, iMagFilter);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, iMinFilter);
glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE16, width, height, depth, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, data);
Once this is done, I use glBindImageTexture like so:
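Something along these lines (volTexID is a placeholder for my texture handle; I chose binding 0 to match the shader declaration, and GL_RGBA16F to match rgba16f):

```c
/* unit 0, mip level 0, layered = GL_TRUE so the whole 3D texture is bound,
   layer 0 (ignored when layered), write-only access, rgba16f image format */
glBindImageTexture(0, volTexID, 0, GL_TRUE, 0, GL_WRITE_ONLY, GL_RGBA16F);
```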
Inside the shader:
layout (binding=0,rgba16f) uniform image3D myVol;
Everything compiles without problems, but when I use imageStore() to write to myVol, I am not seeing any results.
Now, there could be other things wrong, but before I look further, I wanted to make sure the way I am representing the data is correct.
GL_LUMINANCE16 means (L, L, L, 1), where each component has the same value. So the internal representation has 16 bits per channel, 16*4 bits in total. That's why I used rgba16f in the image declaration in the shader.
Is this correct?
Another question: if an image has GL_LUMINANCE16 as its internal format, what does a format of GL_LUMINANCE mean? Is it 8 bits per channel?
Another question: can I bind the same texture, specified by ID, to a texture unit and to an image unit at the same time? My intention is to write to the texture with imageStore() through the image unit, and then read the newly written values through the texture unit binding in another shader that does ray casting.
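In code, what I have in mind is roughly the following (volTexID and the unit numbers are placeholders; I am assuming a glMemoryBarrier is needed between the write pass and the sampling pass):

```c
/* Pass 1: bind to an image unit and write via imageStore() in the shader */
glBindImageTexture(0, volTexID, 0, GL_TRUE, 0, GL_WRITE_ONLY, GL_RGBA16F);
/* ... run the shader that calls imageStore() ... */

/* Make the image writes visible to subsequent texture fetches */
glMemoryBarrier(GL_TEXTURE_FETCH_BARRIER_BIT);

/* Pass 2: bind the same texture to a texture unit and sample it
   in the ray-casting shader */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_3D, volTexID);
```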