16-bit Short Texture / Bitwise Ops in Shader

Hi,

I have been searching the net for two days now, trying to figure out how to access short values stored in RGBA components in a shader. I hope someone can help me.

These two functions fill a buffer, upload it as a texture, and read it back to print the values. The values that come back are the same ones I uploaded, so the data itself is stored correctly.


private void createAndFill3DOctreeShortBuffer(){
	// Size the buffer from the loop bounds (here 1x1 texels with SHORT_PER_OCTREE_TEXEL shorts each).
	octreeShortBuffer = BufferUtils.createShortBuffer(
			(blockWorld3DTex_Width / 4) * (blockWorld3DTex_Height / 4) * SHORT_PER_OCTREE_TEXEL);

	for(int w = 0; w < blockWorld3DTex_Width / 4; w++){
		for(int h = 0; h < blockWorld3DTex_Height / 4; h++){
			// (depth loop removed while testing with a 2D texture)
			for(int i = 0; i < SHORT_PER_OCTREE_TEXEL; i++){
				// also tried: (short) (-32768 | 32767) and (short) (16384 | 8192 | 4096 | 2048 | 1024)
				octreeShortBuffer.put((short) 0);
			}
		}
	}

	octreeShortBuffer.flip();
}
	
private void create3DOctreeTexture(){
	octree_L4_2DTexID = GL11.glGenTextures();
	GL13.glActiveTexture(GL13.GL_TEXTURE0);
	GL11.glBindTexture(GL11.GL_TEXTURE_2D, octree_L4_2DTexID);
	// The parameter calls must target GL_TEXTURE_2D, the binding point the texture is bound to.
	GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
	GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
	GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
	GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);

	GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA16, blockWorld3DTex_Width / 4, blockWorld3DTex_Height / 4, 0, GL12.GL_BGRA, GL11.GL_SHORT, octreeShortBuffer);

	// Read the texel back to verify the upload.
	ShortBuffer pixels = BufferUtils.createShortBuffer(4);
	GL11.glGetTexImage(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, GL11.GL_SHORT, pixels);
	for(int i = 0; i < 4; i++){
		System.out.println(pixels.get(i));
	}
}

I am using Java and LWJGL, creating the texture with the internalFormat, format, and type shown above.
What I want is a 1x1 texture with one short per component, 4 shorts / 8 bytes in total.

In my shader I access the texel like this (octree3DTex is an isampler2D):


ivec4 octreeData = texelFetch(octree3DTex, ivec2(0, 0), 0);

I am trying to do something like the shader code below with the shorts/ints: checking whether certain bits of the two bytes are set.

Example:
short as bits: 0000 0000 1111 1111
expanded to an ivec4 component (I think): (16 leading zeros) 0000 0000 1111 1111
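
For comparison, plain Java does the same widening when a short is assigned to an int; a quick standalone check (nothing LWJGL-specific) shows both the positive and the sign-extended case:

short s = (short) 0x00FF;   // bits: 0000 0000 1111 1111
int widened = s;            // positive value: 16 leading zero bits, 0x000000FF
System.out.println(Integer.toBinaryString(widened)); // 11111111

short neg = (short) 0xFF00; // -256; widening sign-extends to 0xFFFFFF00
System.out.println(Integer.toBinaryString(neg));     // 24 ones followed by 8 zeros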


octantVal = ivec2(octreeData.r, octreeData.g);
v0123.x = ((octantVal.x & 51) == 51) && ((octantVal.y & 51) == 51);

octantVal = ivec2(octreeData.b >> 8, octreeData.a >> 8);
v4567.w = ((octantVal.x & 204) == 204) && ((octantVal.y & 204) == 204);

With the example above, the first boolean should be true and the second false.
This is not working at all. I assumed the short value is (or should be) expanded to a 32-bit int (well, apparently not, but that is what I was thinking).
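
For what it's worth, the mask logic itself checks out on the CPU: a standalone Java snippet mirroring the shader expressions with the example value 0x00FF prints the expected result:

int val = 0x00FF;                           // example texel component
boolean first  = (val & 51) == 51;          // 51  = 0b00110011 -> true, the low byte is all ones
boolean second = ((val >> 8) & 204) == 204; // 204 = 0b11001100 -> false, the high byte is zero
System.out.println(first + " " + second);   // prints: true false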

GLSL is hard to debug, so I “printf” this by writing debug colors, with the help of OpenGL 4.1:


int bc = bitCount(octreeData.r);
if(bc == 0){
	gs_FragData[0] = vec4(1.0, 0.0, 0.0, 1.0); // red
}else if(bc > 0 && false){ // branch disabled on purpose while testing
	gs_FragData[0] = vec4(0.0, 1.0, 0.0, 1.0); // green
}else if(bc == 7){
	gs_FragData[0] = vec4(1.0); // white
}else{
	gs_FragData[0] = vec4(0.0); // black
}

GL_SHORT is signed, and I provide signed values. Yet even when I fill the texture with 0, bc is 7 and the frame is all white. How can this be?

Another weird thing: a short ranges from -32768 to 32767, yet putting a negative number, or a number greater than 32767, into the buffer results in bc == 0… Is this a Java problem?
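
For reference, the (short) cast in Java only wraps the value; the bit pattern itself is preserved, as this standalone check shows:

short a = (short) 32768; // wraps to -32768, bit pattern 0x8000
short b = (short) -1;    // bit pattern 0xFFFF
System.out.println(Integer.toHexString(a & 0xFFFF)); // 8000
System.out.println(Integer.toHexString(b & 0xFFFF)); // ffff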

How can I get at the int/short values so I can do bitwise ops like the ones above?
Any help is much appreciated.

In my shader I access the texel like this (octree3DTex is an isampler2D):

You can only use an i-sampler if the texture is an integer texture. And it’s not.

This call:

GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA16, blockWorld3DTex_Width / 4, blockWorld3DTex_Height / 4, 0, GL12.GL_BGRA, GL11.GL_SHORT, octreeShortBuffer);

creates a texture with a 16-bit unsigned normalized format: the shorts are stored as integers but returned to the shader as floats in [0, 1]. It does not create an integral texture format.
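
That means a (float) sampler2D would hand you the stored 16-bit value divided by 65535, which is useless for bit tests. Illustrative arithmetic only, not GL code:

int raw = 255;                  // 16-bit component value as stored
double sampled = raw / 65535.0; // what a float sampler returns: ~0.00389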

If you want to actually create an integral texture, you must do two things:

1: Use an integral image format. GL_RGBA16I, for example.

2: Upload integral data. This means the pixel transfer format must end in _INTEGER.

So your call should look something like this:

GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL30.GL_RGBA16I, blockWorld3DTex_Width / 4, blockWorld3DTex_Height / 4, 0, GL30.GL_RGBA_INTEGER, GL11.GL_SHORT, octreeShortBuffer);
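
One more note: if you keep the glGetTexImage readback from your first post, its pixel transfer format must also be the _INTEGER variant, otherwise GL_INVALID_OPERATION is generated. Something like this (assuming the same pixels buffer):

GL11.glGetTexImage(GL11.GL_TEXTURE_2D, 0, GL30.GL_RGBA_INTEGER, GL11.GL_SHORT, pixels);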

Thank you very much, that worked :).