schr_10

08-26-2010, 06:51 PM

Hi,

what is the expected value - on screen - of this frag shader code (this is coloring a quad and I have deleted irrelevant lines):

"#version 150\n"

"out vec4 color;"

"void main() {"

"float colval=16380.0/16383.0;"

"color = vec4(colval,colval,colval,alpha);"

"}";

When I use glReadPixels to read the rendered value I get 1.000000. I need to go down to something like 16200.0/16383.0 for the output to be slightly less than 1 (0.989 or so).

So I figure I have some sort of precision problem in the shader. Where should I look to identify the problem?

My framebuffer?

Thanks in advance,

Soren
