Texture indexing and value range

Hi,

I have two questions about textures in GLSL:

1. Is indexing in texture/texture2D/texelFetch row major or column major?
   For example, if I want to get the value of the pixel in the 3rd row and 2nd column (row = 3, col = 2), which should I use:

vec4 pixelval = texture(samplerTex, vec2(row/height, col/width));

or the other way around:

vec4 pixelval = texture(samplerTex, vec2(col/width, row/height));

?
2. Is there any way to create a texture (GL_RGB, GL_FLOAT) that will not clamp input values to the [0, 1] range?

Is indexing in texture/texture2D/texelFetch row major or column major?

It’s neither row major nor column major. It’s XY (or, using OpenGL terminology, ST): the X coordinate and the Y coordinate. The X is horizontal, and the Y is vertical.
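
In other words, the first component of the vec2 runs horizontally across columns and the second runs vertically across rows, so the second of your two forms is the one that matches a (row, col) index. A minimal sketch of the mapping in C, assuming 0-based texel indices (the helper name is hypothetical, not an OpenGL call):

/* Convert a 0-based (row, col) texel index into the normalized ST coordinate
   that texture() samples: S (first component) moves across columns, T (second
   component) moves across rows; the +0.5 centres the coordinate on the texel. */
static void texelIndexToST( int row, int col, int texWidth, int texHeight,
                            float* s, float* t )
{
    *s = ( col + 0.5f ) / (float) texWidth;    /* column -> horizontal S */
    *t = ( row + 0.5f ) / (float) texHeight;   /* row    -> vertical   T */
}

/* In the shader this corresponds to:
       vec4 pixelval = texture(samplerTex, vec2(s, t));
   or, skipping normalization and filtering entirely:
       vec4 pixelval = texelFetch(samplerTex, ivec2(col, row), 0);           */

One caveat: OpenGL’s texture origin is the lower-left corner, so whether T = 0 is the top or the bottom row of your image depends on the order in which the rows were uploaded.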

Is there any way to create a texture (GL_RGB, GL_FLOAT) that will not clamp input values to the [0, 1] range?

Are you talking about the texture coordinates or the color values stored in the texture?

If you’re talking about colors, then you need a floating-point image format. If you’re talking about texture coordinates, consider using texelFetch if you want to address texels with integer coordinates. Or use a rectangle texture.

Thanks!

Color values. Setting internalFormat to GL_RGB32F in the glTexImage2D call did the trick (plain GL_RGB clamps the values), together with a driver update; before the update it always threw GL_INVALID_VALUE.
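
In other words, roughly this (a sketch; data, width, and height are placeholders for the real values):

GLuint tex;
glGenTextures( 1, &tex );
glBindTexture( GL_TEXTURE_2D, tex );

/* Sized floating-point internal format: values outside [0, 1] are stored as-is. */
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB32F,        /* internal format (storage)  */
              width, height, 0,
              GL_RGB, GL_FLOAT, data );           /* pixel transfer format/type */

/* Default MIN_FILTER expects mipmaps; set the filters so the texture is complete. */
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );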

Anyway, it’s strange that GL_RGB32F is not listed as an allowed constant for internalFormat on the glTexImage2D OpenGL man page.

Anyway, it’s strange that GL_RGB32F is not listed as an allowed constant for internalFormat on the glTexImage2D OpenGL man page.

Considering how many other formats are missing from that page, I’m not surprised. There are no signed integral formats, unsigned integral formats, or signed normalized formats. Indeed, with the exception of the depth/stencil formats and the sRGB formats, I don’t see any format listed on that page that was added to core OpenGL with GL 3.0. Sounds like a documentation bug.

One more question: can a pixel pack buffer be used to read back a GL_RGBA32F texture?

I’ve got texture readback (to CPU memory) working, but if I create a texture with the GL_RGBA32F internal format, all I get is zeros (in gDEBugger I can see that the texture values are as expected). Is something wrong with this code?

glBindBuffer( GL_PIXEL_PACK_BUFFER, pixPBuffId );
glBufferData( GL_PIXEL_PACK_BUFFER, bytes, NULL, GL_STREAM_READ );

glReadPixels( 0, 0, width, height, GL_RGBA, GL_FLOAT, 0 );

GLubyte* bufferData = ( GLubyte* ) glMapBuffer( GL_PIXEL_PACK_BUFFER, GL_READ_ONLY );

if( bufferData != NULL )
{
    memcpy( destination, bufferData, imBytes);
    glUnmapBuffer( GL_PIXEL_PACK_BUFFER ); 
}

With the code above, for exactly the same texture but with a GL_RGBA internal format, I get correct results.

glReadPixels does not accept GL_RGBA32F ( GL_INVALID_ENUM ).

One more question: can a pixel pack buffer be used to read back a GL_RGBA32F texture?

Yes.

glReadPixels does not accept GL_RGBA32F ( GL_INVALID_ENUM ).

That’s not how pixel transfer format enumerators work. The format parameter just says which components you want (and in which order). The type parameter (GL_FLOAT here) tells the system what kind of values you want to get back.
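
For example (a sketch, reusing the width/height from your code): the read is described entirely by the format/type pair, and GL_RGBA32F itself never appears in the call:

/* Read back as floats (not clamped for a float source) ...                    */
glReadPixels( 0, 0, width, height, GL_RGBA, GL_FLOAT, 0 );
/* ... or as 8-bit normalized values (converted, and clamped to [0, 1]).       */
glReadPixels( 0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, 0 );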

Have you tried just using glGetTexImage or glReadPixels without the buffer object? Does that work?
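
That would look roughly like this (a sketch; it assumes the GL_RGBA32F texture is bound to GL_TEXTURE_2D and that <stdlib.h> is included for malloc/free):

glBindBuffer( GL_PIXEL_PACK_BUFFER, 0 );   /* make sure no pack PBO is bound */

float* pixels = (float*) malloc( (size_t) width * height * 4 * sizeof( float ) );
glGetTexImage( GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, pixels );
/* ... inspect pixels: 4 unclamped floats per texel ... */
free( pixels );

If that returns the expected values, the problem is in the PBO path; one thing worth checking there is whether bytes in the glBufferData call covers width * height * 4 * sizeof(float), since an RGBA float read needs 16 bytes per pixel.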