I am trying to use 3D texture borders on a Radeon 9600 in a G5. I am making an array with power-of-two-plus-two edge lengths and passing those numbers to glTexImage3D, but the driver appears to be rounding the sizes down to powers of two, skewing the border voxels into the visible part of the texture.
Are 3D texture borders supported, and if not, what should the behavior be?
I guess I’ll revert to faking it by using the middle power-of-two-minus-two subvolume of the texture.
I am making an array with power-of-two-plus-two edge lengths and passing those numbers to glTexImage3D, but the driver appears to be rounding the sizes down to powers of two, skewing the border voxels into the visible part of the texture.
I’m not sure what you mean by this. Did you query the texture level parameters to verify it? If you have a 16x16x16 RGB image and you want borders, then you would supply an 18x18x18 RGB image, setting the border parameter to 1 in the glTexImage3D() call (the width, height, and depth you pass include the border). Calling glGetTexLevelParameteriv() with GL_TEXTURE_WIDTH, GL_TEXTURE_HEIGHT, GL_TEXTURE_DEPTH, and GL_TEXTURE_BORDER should return the values that you supplied.
The wrap mode you choose is very important: border texels are only ever sampled with GL_CLAMP (or GL_CLAMP_TO_BORDER), while GL_CLAMP_TO_EDGE and GL_REPEAT ignore them entirely.
Also, if you’re getting some jazz in your image, try setting glPixelStorei(GL_UNPACK_ALIGNMENT, 1) or 2 (the default is 4). With borders, the row width is often not a multiple of 4 bytes, and the wrong alignment can lead to some unwanted funk.