[QUOTE=GClements;1280781]Realistically, you need to render the scene into a texture rather than directly the default framebuffer (i.e. window). The framebuffer needs enough precision to store linear (non-gamma-corrected) intensities and enough range not to have to clamp emissive surfaces (e.g. the surface of a lamp should be something like a thousand times brighter than a brightly-lit white wall).
If you try to apply bloom to a gamma-corrected and clamped sRGB image, you typically won’t be able to distinguish between bright surfaces and (much brighter) light sources, although only the latter should produce a noticeable bloom.[/QUOTE]
Thank you again. I ran into trouble when trying to achieve the same effect on a lit scene.
Before adding the bloom effect I want to make sure that my render-to-texture pass works.
Unfortunately I am getting some "noise" in the results; could you please tell me what I am doing wrong?
The "noise" shows up as black pixels, UI components, and remnants of previously rendered frames
on the texture rendered into the off-screen framebuffer (not the default one).
It seems as if the framebuffer never gets cleared, or something similar:
http://s8.postimg.org/6fvrrgtsl/glsh.png
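Do I need an explicit clear right after binding the FBO? Something like this (a hypothetical sketch, not my actual code; fboId, frameWidth and frameHeight refer to the setup code below):

```java
// Hypothetical sketch: clear the off-screen framebuffer after binding it,
// before drawing anything into it, so stale contents cannot show through.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
GLES20.glViewport(0, 0, frameWidth, frameHeight);
GLES20.glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
// ... draw the sphere here ...
```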
This is the fragment shader I use:
precision mediump float;       // Medium precision is sufficient in the fragment shader.

uniform vec3 u_LightPos;       // The position of the light in eye space.
uniform sampler2D u_Texture;   // The input texture.

varying vec3 v_Position;       // Interpolated position for this fragment.
varying vec4 v_Color;          // Color from the vertex shader, interpolated across the triangle.
varying vec3 v_Normal;         // Interpolated normal for this fragment.
varying vec2 v_TexCoordinate;  // Interpolated texture coordinate per fragment.

// The entry point for our fragment shader.
void main()
{
    // Will be used for attenuation.
    float distance = length(u_LightPos - v_Position);

    // Direction vector from the fragment towards the light.
    vec3 lightVector = normalize(u_LightPos - v_Position);

    // Dot product of the light vector and the surface normal: maximum
    // illumination when they point in the same direction.
    float diffuse = max(dot(v_Normal, lightVector), 0.0);

    // Apply linear distance attenuation.
    diffuse = diffuse * (1.0 / (0.5 * distance));

    // Add ambient lighting.
    diffuse = diffuse + 0.6;

    // Multiply the color by the illumination level and the texture value
    // to get the final output color.
    gl_FragColor = v_Color * diffuse * texture2D(u_Texture, v_TexCoordinate);
}
Here is the framebuffer setup code (frame dimensions 1250×1250):
public int InitiateFrameBuffer(int fbo, int tex, int rid)
{
    // Bind the frame buffer.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);

    // Bind the texture, allocate its storage and set its parameters.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, frameWidth, frameHeight, 0,
            GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);

    // Bind the render buffer and define its dimensions.
    GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, rid);
    GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, frameWidth, frameHeight);

    // Attach the texture as the FBO color attachment.
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, tex, 0);

    // Attach the render buffer as the depth attachment.
    GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, rid);

    // Check completeness while the FBO is still bound; checking after
    // unbinding would query the default framebuffer instead.
    System.out.println(GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER));

    // Restore the default bindings.
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
    GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

    return tex;
}
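For completeness, this is roughly how I generate the ids and call the function above (a simplified sketch; the surrounding field names are not shown):

```java
// Generate one framebuffer, one texture and one renderbuffer id,
// then hand them to the setup routine above.
int[] fboIds = new int[1];
int[] texIds = new int[1];
int[] rboIds = new int[1];
GLES20.glGenFramebuffers(1, fboIds, 0);
GLES20.glGenTextures(1, texIds, 0);
GLES20.glGenRenderbuffers(1, rboIds, 0);
int renderTexture = InitiateFrameBuffer(fboIds[0], texIds[0], rboIds[0]);
```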
Viewport setup:
@Override
public void onSurfaceChanged(GL10 glUnused, int width, int height)
{
    // Override the surface dimensions to match the FBO texture size.
    width = 1250;
    height = 1250;

    // Set the OpenGL viewport to the same size as the surface.
    GLES20.glViewport(0, 0, width, height);

    // Create a new perspective projection matrix. The height stays the same
    // while the width varies according to the aspect ratio.
    final float ratio = (float) width / height;
    final float left = -ratio;
    final float right = ratio;
    final float bottom = -1.0f;
    final float top = 1.0f;
    final float near = 1.0f;
    final float far = 1000.0f;
    Matrix.frustumM(mProjectionMatrix, 0, left, right, bottom, top, near, far);
}
The GL state I set up:
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthFunc(GLES20.GL_LEQUAL);
GLES20.glFrontFace(GLES20.GL_CCW);
GLES20.glEnable(GLES20.GL_CULL_FACE);
GLES20.glCullFace(GLES20.GL_BACK);
I am using a power-of-two framebuffer size. Before drawing I bind the created framebuffer
and render the sphere into it.
Then I switch back to the default framebuffer and try to draw the rendered texture onto a simple quad mesh.
Without the framebuffer, the sphere and its texture render correctly.
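To summarize, the per-frame flow I described looks like this (a sketch; drawSphere and drawQuad are placeholders for my actual mesh draw calls, and renderTexture is the texture returned by the setup function):

```java
// Pass 1: render the sphere into the off-screen framebuffer.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fboId);
drawSphere();

// Pass 2: switch back to the default framebuffer and draw a quad
// textured with the color attachment of the FBO.
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, renderTexture);
drawQuad();
```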