How to fix overexposed look?

I went from using GL_RGBA8 to GL_ALPHA8 with glTexImage3D. My images were grayscale and I wanted to slim down the amount of data used. I was setting Red, Green, Blue, and Alpha to the same value. The RGBA8 image looked fantastic. The ALPHA8 image looks like an overexposed photograph. Any idea what I am doing wrong? Is there anything else that should have been adjusted along with the change from RGBA to ALPHA?
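
Roughly what the two upload calls look like (width/height/depth and the data pointers are placeholders):

/* Old path: four bytes per voxel, with R = G = B = A. */
glTexImage3D(GL_TEXTURE_3D, 0, GL_RGBA8, width, height, depth, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, rgbaData);

/* New path: one byte per voxel; this is the one that comes out overexposed. */
glTexImage3D(GL_TEXTURE_3D, 0, GL_ALPHA8, width, height, depth, 0,
             GL_ALPHA, GL_UNSIGNED_BYTE, alphaData);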

When I load 8-bit images I use GL_LUMINANCE8 as the internal format and have never had any issues.

My experience matches BionicBytes’.

Pay close attention to Table 9-4, Deriving Color Values from Different Texture Formats, in the OpenGL Programming Guide. In particular, GL_RGBA textures intuitively look up as R,G,B,A, whereas GL_ALPHA looks up as 0,0,0,A. (Note: LUMINANCE textures look up as L,L,L,1.) This applies whether you are doing your texturing with the fixed-function pipeline or with shaders. The same information is in Table 3.25 of the latest GL 4.2 compatibility spec.
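
To make that concrete, here is the expansion written out as plain C (purely illustrative; this is not GL code, just the table rows expressed as functions):

typedef struct { float r, g, b, a; } Texel;

/* GL_RGBA:      samples as R,G,B,A */
static Texel sample_rgba(float r, float g, float b, float a) { return (Texel){ r, g, b, a }; }

/* GL_ALPHA:     samples as 0,0,0,A */
static Texel sample_alpha(float a) { return (Texel){ 0.0f, 0.0f, 0.0f, a }; }

/* GL_LUMINANCE: samples as L,L,L,1 */
static Texel sample_luminance(float l) { return (Texel){ l, l, l, 1.0f }; }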

Then, if you are using the fixed-function pipeline to combine the looked-up texture value with other colors for fragment shading, pay close attention to Tables 9-5 and 9-6, the REPLACE, MODULATE, DECAL, BLEND, and ADD texture functions, in the OpenGL Programming Guide; this is also in Tables 3-26 and 3-27 of the latest GL spec.

Map your 0,0,0,A through the texturing function you’re using via Tables 9-5 and 9-6 to make sure the math works out the way you want. My guess is it doesn’t, and you would probably prefer a LUMINANCE8 texture (which samples as L,L,L,1) over ALPHA8 (which samples as 0,0,0,A).
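
For example, here is the GL_MODULATE row worked out for the two formats, again just as illustrative C, with (Cf, Af) the incoming fragment color and At / Lt the sampled texel:

typedef struct { float r, g, b, a; } Color;

/* GL_ALPHA texture with GL_MODULATE: Cv = Cf, Av = Af * At.
   The RGB of the incoming fragment passes through untouched, so a bright
   base color stays bright, which is exactly the washed-out look. */
static Color modulate_alpha(Color cf, float at)
{
    return (Color){ cf.r, cf.g, cf.b, cf.a * at };
}

/* GL_LUMINANCE texture with GL_MODULATE: Cv = Cf * Lt, Av = Af.
   Here the texel scales the RGB, which is usually what you want. */
static Color modulate_luminance(Color cf, float lt)
{
    return (Color){ cf.r * lt, cf.g * lt, cf.b * lt, cf.a };
}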

Yep, Dark Photon is exactly right.
That’s why I used the GL_LUMINANCE8 format: because of the way shaders and the fixed-function pipeline interpret the data.

I would instead suggest using ARB_texture_swizzle to change how the format is interpreted.

GL_LUMINANCE* and GL_INTENSITY* formats are deprecated.

Also, texture swizzling is much more flexible than what you can achieve with the deprecated formats.

In your case, just create a GL_R8 texture and then use swizzling to configure it as (R,R,R,R), (R,R,R,1), (R,R,R,0) or whatever combination works for your use case.
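
A minimal sketch, assuming GL 3.3+ (or ARB_texture_swizzle) and a 3D texture like yours; width, height, depth, and data are placeholders:

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_3D, tex);

/* Sample as (R,R,R,1), matching the old LUMINANCE8 behaviour. */
const GLint swizzle[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);

/* One byte per voxel. */
glTexImage3D(GL_TEXTURE_3D, 0, GL_R8, width, height, depth, 0,
             GL_RED, GL_UNSIGNED_BYTE, data);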

I tried to use TEXTURE_SWIZZLE but it didn’t work.

I set these…

glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_R, GL_RED);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_G, GL_RED);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_B, GL_RED);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_SWIZZLE_A, GL_RED);

Then I set glTexImage3D to use GL_R8 as internal format and GL_RED as format.

And the picture comes out blank.

Go figure.

When I use GL_LUMINANCE_ALPHA and provide two bytes per voxel (both set to the same value) it seems to work fine and the image comes out looking nice. My goal, however, is to slim the data down to one value per voxel, not two.
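
Roughly this (dimensions and data pointer are placeholders):

glTexImage3D(GL_TEXTURE_3D, 0, GL_LUMINANCE_ALPHA, width, height, depth, 0,
             GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, laData);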

When I use GL_LUMINANCE on its own or GL_ALPHA on its own, I get funky-looking images.

OK fine, I figured things out.

With glTexImage3D I used GL_INTENSITY as the internalFormat and GL_LUMINANCE as the data format. It works beautifully.
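
That is, something along these lines (dimensions and data pointer are placeholders):

glTexImage3D(GL_TEXTURE_3D, 0, GL_INTENSITY, width, height, depth, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, data);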

I got the idea from this thread…