Proper usage of GL_TEXTURE_2D_ARRAY.

Hi everyone.

I have some problems when trying to use both TEXTURE_2D and TEXTURE_2D_ARRAY in one GLSL fragment shader program. I don't know if I am doing something wrong in OpenGL or GLSL, but here is the situation:

OpenGL 3.2 or 3.3 (tested on both, nvidia driver 197.45 and 197.44 for 3.3)
Win7 64-bit
GTX260.

Now GL initialization:


    sint32 tex0 = glGetUniformLocation(m_enProgram2, "weights"); 
    sint32 tex1 = glGetUniformLocation(m_enProgram2, "layers"); 

    m_gpu->render.program.use(m_enProgram2);

    glUniform1i(tex0, 0);
    glUniform1i(tex1, 1);

    glActiveTexture(GL_TEXTURE0); 
    glBindTexture(GL_TEXTURE_2D,m_glTexWeights);
    glActiveTexture(GL_TEXTURE1); 
    glBindTexture(GL_TEXTURE_2D_ARRAY,m_glTexLayers);

    m_terrain.draw();

FS shader that works:


#version 140
in      vec2      fCoords;
in      vec3      fNormal;
out     vec4      outColor;

uniform sampler2D      weights;
uniform sampler2DArray layers;

void main(void)
{
    vec4 blend = texture(weights, fCoords).rgba;
    vec4 grass = texture(layers, vec3(fCoords, 2.0)).rgba;

    //outColor.a = grass.a;
    outColor.rgb = blend.rgb;
}

I assume that for the FS shown above the graphics driver performs dead-code elimination, so in fact there is no TEX_2D_ARRAY usage at all.

This shader properly displays the texture with weights (it looks like earth continents) across the whole quad.

http://img443.imageshack.us/i/goodl.png/

When I uncomment the line that writes the alpha channel, the result on screen is wrong. I can see layer number 0 stretched from the lower-left corner (0,0) to the center (0.5,0.5). The rest of the quad is black.

http://img819.imageshack.us/i/bad.png/

So the result of uncommenting that line is as if I had written this shader:


void main(void)
{
 outColor.rgba = texture(layers, vec3(fCoords * 0.5, 0.0) ).rgba;
};

I have no idea what I am doing wrong :p. Can I use 2D and 2D_ARRAY textures in the same shader?

Please help :(

If I set the GL state like this:

glActiveTexture(GL_TEXTURE0); 
glBindTexture(GL_TEXTURE_2D,m_glTexWeights);
glActiveTexture(GL_TEXTURE1); 
glBindTexture(GL_TEXTURE_2D,0); 
glBindTexture(GL_TEXTURE_2D_ARRAY,m_glTexLayers);

then m_glTexWeights is treated as layer 0 of the 2D_ARRAY (so layer 0 is displayed instead of the weights).

glActiveTexture(GL_TEXTURE0); 
glBindTexture(GL_TEXTURE_2D,m_glTexWeights);
glBindTexture(GL_TEXTURE_2D_ARRAY,0);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D,0); 
glBindTexture(GL_TEXTURE_2D_ARRAY,m_glTexLayers);

With this second setup the array texture is invisible to GLSL (so when sampling it, I get the weights texture color as the result).

You should be setting the fragment output color’s alpha value to something. Try 1.0.

Can I use 2d and 2d_array textures at the same shader?

Absolutely. I’m doing it. No problems.

The alpha channel was only an example to show the shader with and without TEXTURE_2D_ARRAY.

Actually my FS looks like this:


#version 140
in      vec2      fCoords;
out     vec4      outColor;

uniform sampler2D      weights;
uniform sampler2DArray layers;

void main(void)
{
    vec2 coords = fCoords;
    coords.x = fract(fCoords.x * 714.0);
    coords.y = fract(fCoords.y * 412.0);

    vec4 blend = texture(weights, fCoords).rgba;
    vec4 water = texture(layers, vec3(coords, 4.0)).rgba;
    vec4 grass = texture(layers, vec3(coords, 2.0)).rgba;

    outColor = blend.g * grass + (1.0 - blend.g) * water;
}

And this is my order of setting state:


    m_gpu->render.program.use(m_enProgram2);

    sint32 tex0 = glGetUniformLocation(m_enProgram2, "weights"); 
    sint32 tex1 = glGetUniformLocation(m_enProgram2, "layers"); 

    glUniform1i(tex0, 0);
    glUniform1i(tex1, 1);

    glActiveTexture(GL_TEXTURE0); 
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D,m_glTexWeights);

    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D_ARRAY,m_glTexLayers);

    m_terrain.draw();

With this setup the 2D texture is invisible to GLSL. I added glEnable(GL_TEXTURE_2D) even though I know it is deprecated in OGL 3.0+, just to try, but it still isn't working.
Can you give me some sample of how you set the state in your app?
Maybe a glIntercept dump or something similar?

Thanks

Problem solved. Sometimes the easiest things are the hardest to see.
The problem lay in passing engine handles to GL functions.

was:
sint32 tex0 = glGetUniformLocation(m_enProgram2, "weights");
should be:
sint32 tex0 = glGetUniformLocation(m_glProgram2, "weights");

Thanks for help anyway :).

Nevermind, another stupid bug fixed :).