Hello everyone,
I hope you can help me with this problem.
It might not be a problem specific to OpenGL ES 2.0, but it happens on my Android phone: when I tried to build a GL application, I ran into a weird issue.
My image is partly transparent. Behind the GL Surface is a camera image (for Augmented Reality).
My problem is: when I make the pixels semi-transparent in the fragment shader, I get a weird result. The brightest pixels look overexposed, almost like subtracting color values in Photoshop.
This doesn't happen without transparency, and it only affects the brightest pixels, where I used something like:
gl_FragColor = vec4(clamp(textureColor.rgb * lightWeighting.xyz, 0.0, 1.0), 0.5);
I tried clamping in the shader, but it didn't really help. My guess is that the camera picture behind the surface is still adding to the result and making it "brighter".
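To make my guess concrete, here is a small sketch of the arithmetic I suspect is happening. The assumption (mine, not verified) is that the compositor treats the GL surface output as premultiplied alpha, so a non-premultiplied fragment color gets added almost directly on top of the camera image, and bright pixels overflow 1.0 and clamp, which would explain the overexposed look. Shader-side clamping can't prevent this because the overflow happens after the shader runs:

```java
public class BlendMath {
    // Non-premultiplied output composited as if it were premultiplied:
    // out = glColor + camera * (1 - glAlpha). Bright inputs overflow and clamp.
    static float compositeAsPremultiplied(float glColor, float glAlpha, float camera) {
        return Math.min(1.0f, glColor + camera * (1.0f - glAlpha));
    }

    // Properly premultiplied output: the color is scaled by alpha first,
    // so the sum can never exceed 1.0 for in-range inputs.
    static float compositePremultiplied(float glColor, float glAlpha, float camera) {
        return Math.min(1.0f, glColor * glAlpha + camera * (1.0f - glAlpha));
    }

    public static void main(String[] args) {
        // A bright fragment (0.9) at alpha 0.5 over a bright camera pixel (0.8):
        System.out.println(compositeAsPremultiplied(0.9f, 0.5f, 0.8f)); // 0.9 + 0.4 = 1.3, clamps to 1.0
        System.out.println(compositePremultiplied(0.9f, 0.5f, 0.8f));  // 0.45 + 0.4, stays below 1.0
    }
}
```

If that assumption is right, it would explain why only the brightest pixels misbehave: darker pixels never push the sum past 1.0.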
Anyway, I hope someone here has seen that problem before.
Thank you for your help,
Tobias
P.S. I experimented a bit more with this problem.
I found that the blend mode:
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA_SATURATE, GLES20.GL_ONE);
gives a rough fix. With that quick-and-dirty setting it somehow works, but there must still be a better way to control the "oversaturation"…