texture2D refuses to sample 1-channel texture



Douglass Turner
10-27-2010, 02:14 PM
Hi,

I am writing shaders for OpenGL ES on iPhone OS. Within a fragment shader I was shocked to discover the following:

These all work:
vec4 rgba = texture2D( myRGBATexture, v_st);
float r = texture2D( myRGBATexture, v_st).r;
float g = texture2D( myRGBATexture, v_st).g;
float b = texture2D( myRGBATexture, v_st).b;

These ALL return a constant value of zero (0), ignoring the contents of the one-channel texture (!?!). Note, at texture creation time OpenGL says I have a texture format of GL_ALPHA and data type of GL_UNSIGNED_BYTE:

float a = texture2D( myOneChannelTexture, v_st).r;
float a = texture2D( myOneChannelTexture, v_st).g;
float a = texture2D( myOneChannelTexture, v_st).b;
float a = texture2D( myOneChannelTexture, v_st).a;


Can someone please explain why GLSL forces me to waste tons of memory with a 3-channel texture when a 1-channel texture is all I need?

Thanks,
Doug

feelgood
10-27-2010, 03:09 PM
As per the OpenGL specification, a GL_ALPHA texel maps to the following { R, G, B, A } vector:



{ R = 0, G = 0, B = 0, A = A }


The following fragment shader demonstrates how to sample from GL_ALPHA:



uniform sampler2D S; // Sampler referencing ALPHA
varying vec2 V; // texture coordinate

void
main( void )
{
// Map RGBA to AAA1
gl_FragColor = vec4( texture2D( S, V ).aaa, 1.0 );
}


Look at sections 3.6 (Pixel Rectangles) and 3.7 (Texturing) of the OpenGL (ES) API specification for more information.
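To make the spec's table concrete, here is a small plain-C model (not real GL, just a sketch of the behavior) of the GL_ALPHA expansion and the .aaa swizzle from the shader above:


#include <assert.h>

typedef struct { float r, g, b, a; } vec4;

/* Per the spec table: a GL_ALPHA texel A is seen by the
   shader as ( 0, 0, 0, A ). */
static vec4 fetch_alpha( float A )
{
    return (vec4){ 0.0f, 0.0f, 0.0f, A };
}

/* Models: gl_FragColor = vec4( texture2D( S, V ).aaa, 1.0 ); */
static vec4 swizzle_aaa1( vec4 t )
{
    return (vec4){ t.a, t.a, t.a, 1.0f };
}

int main( void )
{
    vec4 texel = fetch_alpha( 0.75f );
    assert( texel.r == 0.0f );  /* this is why .r reads a constant zero */
    assert( texel.a == 0.75f ); /* the data lives in .a */

    vec4 color = swizzle_aaa1( texel );
    assert( color.r == 0.75f && color.g == 0.75f && color.b == 0.75f );
    return 0;
}


So the texture data was never lost; it simply only appears in the .a component for a GL_ALPHA texture.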

nickels
10-27-2010, 03:25 PM
Can someone please explain why GLSL forces me to waste tons of memory with a 3-channel texture when a 1-channel texture is all I need?


I believe it is something to the effect that variables declared in shaders are going to be 16-byte aligned anyway, which must have something to do with the register granularity.
You can certainly declare a vec4 instead of a float and use all 4 components to different ends.
A nice side effect is that even if you change the texture type, the shader won't necessarily have to change (e.g. I could swap in a luminance texture and my color shader would correctly function as a grayscale shader)...
I don't necessarily know what I am talking about, but it's a guess based on various intimations I have picked up...

chenjs
11-13-2010, 05:26 AM
Use GL_LUMINANCE instead of GL_ALPHA. A luminance texel L maps to { R = L, G = L, B = L, A = 1 }, so .r returns your channel directly.
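For comparison with the GL_ALPHA case above, a plain-C sketch (not real GL) modeling the GL_LUMINANCE expansion from the spec's table, where a texel L is seen as ( L, L, L, 1 ):


#include <assert.h>

typedef struct { float r, g, b, a; } vec4;

/* Per the spec table: a GL_LUMINANCE texel L is seen by the
   shader as ( L, L, L, 1 ). */
static vec4 fetch_luminance( float L )
{
    return (vec4){ L, L, L, 1.0f };
}

int main( void )
{
    vec4 texel = fetch_luminance( 0.75f );
    assert( texel.r == 0.75f ); /* .r now returns the one-channel data */
    assert( texel.g == 0.75f ); /* as do .g and .b */
    assert( texel.a == 1.0f );  /* alpha is a constant 1 */
    return 0;
}


This is why swapping the upload format to GL_LUMINANCE makes the original texture2D( ... ).r lookups work without any shader changes, while still storing only one byte per texel.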