I’m hoping to find out what I am doing wrong with my indices. I have a 3D texture of dimensions width * height * 10.
I use a 4-float (RGBA) representation per texel:
texdata = (float*)calloc( 10 * 4 * width * height, sizeof(float));
In 2D I usually go by:
// pixel index
int pindex = y * width + x;
float *pixel = texdata + 4*pindex;
r = pixel[0];
g = pixel[1];
b = pixel[2];
For the 3D texture I have assumed that the RGBA components stay in sequence and that each step in depth means one full image to skip over:
// voxel index
int vindex = z * width * height + pindex;
float *pixel = texdata + 4*vindex;
I set everything up with the 3D texture calls, and I want no interpolation:
glEnable(GL_TEXTURE_3D);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_3D, tex3d);
glTexParameteri(GL_TEXTURE_3D,GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D,GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA, width, height, 10,
             0, GL_RGBA, GL_FLOAT, texdata);
In the fragment shader I sample with:
texture3D(texdata, vec3( s, t, 0 ) );
and I see the first layer just fine. But for any r coordinate greater than 0, e.g.:
texture3D(texdata, vec3( s, t, 0.1 ) );
I still see the first layer!
I have no shader compile errors, and I don’t get any GL errors for that frame. The render of the first layer is correct, and I re-rendered each layer in 2D GL with the expected results (albeit in my packing order). Is there some packing format that I failed to declare with glPixelStore?
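The unpack parameters I’m aware of that could matter for a 3D upload are below (with tightly packed floats, each row is a multiple of 4 bytes, so I believe the default alignment of 4 is already harmless, but just in case):

```c
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);     /* byte-aligned rows             */
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);    /* rows are exactly width texels */
glPixelStorei(GL_UNPACK_IMAGE_HEIGHT, 0);  /* layers are exactly height rows */
```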
I can’t figure it out.