Unused alpha channel

Will having an unused alpha channel in your textures affect performance? I know it is a waste of memory.

Cheers,
Robin.

It won’t affect performance, and on many current graphics cards it doesn’t even cost extra memory. For example, many (all?) nVidia cards store 24-bit RGB textures as 32-bit internally anyway.

In fact, uploading 32-bit textures can be faster than uploading 24-bit textures, because the driver doesn’t need to pad the 24-bit values out to 32 bits first.
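A rough sketch of what I mean (tex, width, height and pixels are just placeholders here, and GL_RGBA8 is simply the obvious 32-bit internal format):

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    /* 32-bit source data with a (possibly unused) alpha byte; the driver
       can copy it straight through instead of padding 24-bit RGB pixels. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);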

j

Hmm… you may want to test this theory.

There are also more complex issues, like hosing your image quality depending on which internal texture format gets chosen, which is often forced by the user.

Thanks for the help.

One thing I did notice was that choosing GL_UNSIGNED_BYTE rather than GL_UNSIGNED_INT_8_8_8_8 sped things up for me. That seemed strange, but I guess it has to do with the acceleration hardware… perhaps the driver was a bit dumb.
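For reference, the two calls I was comparing looked roughly like this (width, height and pixels stand in for my actual values); only the type argument changes:

    /* This variant was noticeably faster for me. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* This one was slower, perhaps because the packed-int layout is
       endian-dependent and the driver had to repack the pixels. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_INT_8_8_8_8, pixels);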

Originally posted by Robin Forster:
One thing I did notice was that choosing GL_UNSIGNED_BYTE rather than GL_UNSIGNED_INT_8_8_8_8 sped things up for me.

Perhaps you should try GL_UNSIGNED_INT_8_8_8_8_REV instead.
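Roughly like this (width, height and pixels are placeholders); the _REV type is also commonly paired with GL_BGRA, since on little-endian systems that combination matches the native layout many cards use and tends to avoid any swizzling in the driver:

    /* Same RGBA data, but with the reversed packed type. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);

    /* Commonly recommended fast path: BGRA byte order plus the _REV type. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixels);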