View Full Version : Compute shader and GL_TEXTURE_2D_ARRAY

09-12-2014, 09:10 AM

I'm trying to read data from an FBO (GL_TEXTURE_2D_ARRAY) that contains fragment normals and positions, in order to do deferred shading in a compute shader.
All the data I read seems to be 0. My FBO is not empty and contains good data when I display it, and it is correctly bound.

I try to access it like this:

layout (binding=0) uniform sampler2D inColor;
layout (binding=1) uniform sampler2D inNormal;


void main(void)
{
    const uint i = gl_GlobalInvocationID.x;
    const uint x = i % 1024;
    const uint y = (i - x) / 1024;
    const ivec2 storepos = ivec2(x, y); // gl_GlobalInvocationID.xy
    vec2 coord = vec2(storepos) / vec2(1024.0);
    vec4 pos  = texture2D(inColor, coord);
    vec4 norm = texture2D(inNormal, coord);

    value[i] = queryHardShadow(pos, norm); // my wonderful work xD
    // value[i] = pos.x; // always black if displayed
}

Anyone have an idea?


carsten neumann
09-12-2014, 10:27 AM
Shouldn't the samplers be of type sampler2DArray if you want to use them with array textures? You'll need vec3 texture coordinates in that case as well.
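A minimal GLSL sketch of the suggested change (binding points and the 1024x1024 size come from the original post; sampling layer 0 is an assumption):

```glsl
layout (binding = 0) uniform sampler2DArray inColor;
layout (binding = 1) uniform sampler2DArray inNormal;

// The third texcoord component selects the layer,
// as an un-normalized index (here: layer 0).
vec4 pos  = texture(inColor,  vec3(coord, 0.0));
vec4 norm = texture(inNormal, vec3(coord, 0.0));
```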

09-14-2014, 08:00 AM
Oh, that should be better...

So I tried:

layout (binding=0) uniform sampler2DArray inColor;

vec3 norm = texture(inColor,vec3(coord,1)).xyz;

But it still returns a dummy vec3(0)...

carsten neumann
09-14-2014, 08:39 AM
I assume it is intentional that you are reading from layer 1 (you are passing in vec3(coord, 1)) and that your array texture has at least two layers filled with real data? Other than that I don't know what's wrong; you could try running in a debug context and see if there are any messages, or run under an OpenGL debugger.
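A minimal sketch of the debug-output setup mentioned above (assumes an OpenGL 4.3+ context created with the debug flag and a function loader already initialized; the callback name is an assumption):

```c
/* Hypothetical debug callback; prints every driver message. */
static void GLAPIENTRY debug_cb(GLenum source, GLenum type, GLuint id,
                                GLenum severity, GLsizei length,
                                const GLchar *message, const void *user)
{
    fprintf(stderr, "GL debug: %s\n", message);
}

/* After context creation: */
glEnable(GL_DEBUG_OUTPUT);
glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS); /* report at the offending call */
glDebugMessageCallback(debug_cb, NULL);
```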

09-16-2014, 04:35 AM
It works! I had forgotten to bind my FBO's color texture.
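For anyone hitting the same issue: before dispatching the compute shader, the array textures backing the FBO's color attachments must be bound to the texture units named by the shader's layout(binding=N) qualifiers. A hedged host-side sketch (the texture names `colorTex`/`normalTex` and dispatch size are assumptions; the binding points match the shader above):

```c
/* Bind the FBO's color-attachment textures to the units
 * the compute shader's samplers expect. */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D_ARRAY, colorTex);   /* binding = 0 */
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D_ARRAY, normalTex);  /* binding = 1 */

glDispatchCompute(num_groups_x, 1, 1);
glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT); /* value[] is an SSBO */
```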


A screenshot of a new kind of shadow volume with a huge model!