Results clamped

Hi again. I notice that if my fragment shader returns the constant 9.0, I get 9.0 back in the texture I read from the FBO. But if it returns values sampled from a texture I passed in as input to the fragment shader, the values I get back are clamped to the 0-1 range.

This is the setup of the input texture:

	glBindTexture(GL_TEXTURE_2D, texName);

	//Setting parameters for the texture
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

	//Moving data to the texture
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, size, size, 0, GL_RED, GL_UNSIGNED_BYTE, texPtr);

	char errormsg[50];
	sprintf(errormsg,"Error creating or Binding texture: %s ",textureName);
	checkGLErrors(errormsg);

If I print the buffer array to a file, I get normal values between 0 and the 16-bit maximum.

	glGenTextures(1, &texture1);
	init2DTexture("texture1", texture1, size, size, buffer);
	free(buffer);

And this is the whole output texture creation and FBO rendering:

	//Setup output texture
	GLuint img;
	glGenTextures(1, &img);
	glBindTexture(GL_TEXTURE_2D, img);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
	glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB,  size, size, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
	glTexSubImage2D(GL_TEXTURE_2D,0,0,0,size,size,GL_RED,GL_FLOAT,data);

	glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, img, 0);
	checkFramebufferStatus();	
	checkGLErrors("Output Texture Creation");



	glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
	checkGLErrors("Frame Buffer binding");
	checkFramebufferStatus();

	glDrawBuffer(GL_COLOR_ATTACHMENT0_EXT); 
	glClearColor( 0.0f, 0.0f, 0.2f, 1.0f );
	glPolygonMode(GL_FRONT, GL_FILL); 
	glBegin( GL_QUADS );
	glTexCoord2f(0.0, 1.0); 
	glVertex2f(0.0, 0.0);
	glTexCoord2f(1.0, 1.0); 
	glVertex2f(size, 0.0);
	glTexCoord2f(1.0, 0.0);
	glVertex2f(size, size);
	glTexCoord2f(0.0, 0.0); 
	glVertex2f(0.0, size);
	glEnd();
	checkFramebufferStatus();
	checkGLErrors("Render to FBO");

	//Reading values from the framebuffer into an array of floats
	glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
	glReadPixels(0, 0, size, size,GL_RED,GL_FLOAT,pixels);

I know it's tiring, but I am sure there must be a proper way to get the right values back. Or is there a way to transform the clamped values I got back into the original ones?

You may want to explain your exact problem a bit better here.
You're loading the data of the input texture as GL_UNSIGNED_BYTE, which means it is interpreted as 8-bit values between 0 and 255, where 0 is mapped to 0.0 and 255 is mapped to 1.0.

I don't see where you are binding a fragment program/shader. You need to use a shader to be able to render floating-point values.

Nico, the call that uses GL_UNSIGNED_BYTE isn't uploading data, it's just allocating (the data ptr is NULL).

Try this.

glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);
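For what it's worth, these are global state calls rather than texture parameters, so they only need to be made once. A rough sketch of the ordering, reusing the names from the code above and assuming ARB_color_buffer_float is available:

// Disable fixed-function color clamping once during setup (global state,
// not per texture). Assumes the ARB_color_buffer_float extension is present.
glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);   // affects glReadPixels

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
// ... draw the quad with the shader bound ...
glReadBuffer(GL_COLOR_ATTACHMENT0_EXT);
glReadPixels(0, 0, size, size, GL_RED, GL_FLOAT, pixels);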

@MalcolmB
I was referring to the input texture creation, not the output texture:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, size, size, 0, GL_RED, GL_UNSIGNED_BYTE, texPtr);

You may want to explain your exact problem a bit better here.
You're loading the data of the input texture as GL_UNSIGNED_BYTE, which means it is interpreted as 8-bit values between 0 and 255, where 0 is mapped to 0.0 and 255 is mapped to 1.0.

Yes, and I get back results in the same format. The question is: what if I want to get back results that are not between 0 and 1 but are instead in 0-255? And basically I need to make it 16-bit, not 8. :$ So should I change to GL_FLOAT for that?

Try this.

Code:

glClampColorARB(GL_CLAMP_VERTEX_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_READ_COLOR_ARB, GL_FALSE);
glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);

I tried this on both the input and output textures but no luck at all. :$

It's not NULL. It's a parameter to the function and it's of type unsigned char. :$ … Hope this is what you meant.

I use a program and shader, I just didn't post that part.

You're reading from a 16-bit floating-point texture. So you can read back the data as GL_UNSIGNED_SHORT (16-bit), but then you'll still have the problem that the maximum representable value corresponds to 1.0, so it'll still be clamped. Maybe you're better off reading back the results as float and converting them to 16-bit yourself.
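A minimal sketch of that conversion, assuming the read-back values lie in 0.0-1.0 (if they can be larger, divide by your own maximum first) and that pixels and pixels16 are buffers of size*size elements:

// pixels   : float buffer already filled by glReadPixels(..., GL_RED, GL_FLOAT, pixels)
// pixels16 : unsigned short buffer of size*size elements for the result
for (int i = 0; i < size * size; ++i) {
    float v = pixels[i];
    if (v < 0.0f) v = 0.0f;   // quantizing needs a 0..1 range;
    if (v > 1.0f) v = 1.0f;   // scale by your own maximum first if values can exceed 1.0
    pixels16[i] = (unsigned short)(v * 65535.0f + 0.5f);
}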

To read in float, I'll need to change my output texture format from GL_UNSIGNED_BYTE to GL_FLOAT? And I guess to convert it to 16-bit I need to multiply each value by 2^16?

The other question I have, then, is: why, if I write gl_FragColor = vec4(8.0, 0.0, 0.0, 1.0), do I read back 8.0 and not the equivalent normalized to 0-1?

I'm not following you; all your textures are already in GL_RGB16F_ARB format and you're reading back the values in float, so there is no clamping. If you write 8.0 as output in the fragment shader you will get back 8.0. If you sample from a texture in your fragment shader it will return the value in the texture, but be aware that if you uploaded the data to that texture with GL_UNSIGNED_BYTE, the values in that texture will always be in the range 0.0-1.0.

There's a big difference between the internal and the external data format: the internal format defines how the data is stored within the texture, while the external format defines how the data is represented in system memory.
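To make the difference concrete, a rough sketch with the same internal format but two different external formats (byteData and floatData are just placeholder buffers of size*size red values):

// Same GL_RGB16F_ARB internal format in both cases.

// Uploading as unsigned bytes: 0..255 is normalized, so the texture ends up holding 0.0..1.0
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, size, size, 0,
             GL_RED, GL_UNSIGNED_BYTE, byteData);

// Uploading as floats: the values are stored as-is, so sampling can return e.g. 9.0
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, size, size, 0,
             GL_RED, GL_FLOAT, floatData);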

Aha, now I understand. So the problem is that I use GL_UNSIGNED_BYTE for the input texture; I have to use GL_FLOAT.
I tried to use GL_FLOAT but I get an exception. I need to convert my data from char to float first.

When you said to convert it to 16-bit myself, what exactly did you mean? Something like what I described in my previous post?

EDIT

OK, I found what to do and now I get the values I pass back normally… :) Thanks again. :) The only thing I still need is to somehow change the image reader to return floats instead of unsigned char, because I can't load them into the texture as they are…
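In case it helps someone later, a minimal sketch of that conversion, assuming buffer is the unsigned char array the image reader returns with size*size values (needs <stdlib.h> for malloc/free):

// Widen the reader's unsigned char data to float before uploading with GL_FLOAT.
// The raw 0..255 values are kept; divide by 255.0f instead if you want a 0.0..1.0 range.
float *floatBuffer = (float *)malloc(size * size * sizeof(float));
for (int i = 0; i < size * size; ++i)
    floatBuffer[i] = (float)buffer[i];

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, size, size, 0,
             GL_RED, GL_FLOAT, floatBuffer);
free(floatBuffer);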
