Layered rendering



brioche
05-31-2015, 09:19 PM
Hello everyone, I'm a beginner in OpenGL and I have a little problem that I can't solve :( (I'm sorry for my English, I'm French x) )

I can't make layered rendering work. I would like to capture different layers of the scene according to its depth: I would like to have different textures of the scene at specific depth intervals, in order to apply a different blur effect to each layer.

I use a geometry shader with gl_Layer.

This is my function that creates a 2D array texture:


void Texture::initGLTexture2DArray(int bytesperpixel, int width, int height, int layerCount) {
glAssert(glGenTextures(1, &mTexId));
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, mTexId));
glAssert(glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, bytesperpixel, width, height, layerCount, 0, GL_RGB, GL_UNSIGNED_BYTE, NULL));
glAssert(glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
glAssert(glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
glAssert(glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE));
glAssert(glTexParameteri(GL_TEXTURE_2D_ARRAY, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE));
}

And I tried these two functions to attach the array to my framebuffer:


void FBO::attachTextureArray(GLenum attachment, Texture *texture) {
glAssert(glFramebufferTexture(GL_FRAMEBUFFER, attachment, texture->getId(), 0));
}
// Using it like this
fboLayered_->attachTextureArray(GL_COLOR_ATTACHMENT0, textureLayered);


void FBO::attachTextureArray2(GLenum attachment, Texture *texture, GLuint layer) {
glAssert(glFramebufferTextureLayer(GL_FRAMEBUFFER, attachment, texture->getId(), 0, layer));
}
// Using it like this
fboLayered_->attachTextureArray2(GL_COLOR_ATTACHMENT0, textureLayered, 0);
fboLayered_->attachTextureArray2(GL_COLOR_ATTACHMENT1, textureLayered, 1);

And this is my render loop:


glm::mat4 modelViewMatrix = camera_->getModelViewMatrix();
glm::mat4 projectionMatrix = camera_->getProjectionMatrix();

//
// Important note before modifying this method :
// see MyRenderer::setViewport for FBO configurations
//
glm::mat4x4 viewToWorldMatrix = glm::inverse(modelViewMatrix);

// draw the scene

fboLayered_->useAsTarget(width_, height_);
glAssert(glDrawBuffers(1, bufs));
glAssert(glClearColor(0.1, 0.1, 0.1, 1.));
glAssert(glClearDepth(1.0));
glAssert(glDepthFunc(GL_LESS));
glAssert(glDisable(GL_BLEND));

fboLayered_->clear(FBO::ALL);// to clear all attached texture

// render ambient and normal
ambientPass(ambientAndNormalLoop_, modelViewMatrix, projectionMatrix, viewToWorldMatrix);

// setup per light rendering : blend each pass onto the previous one
glAssert(glDrawBuffers(1, bufs));
glAssert(glDepthFunc(GL_LEQUAL));
glAssert(glEnable(GL_BLEND));
glAssert(glBlendFunc(GL_ONE, GL_ONE));
glAssert(glDepthMask(GL_FALSE));

// render for each light
lightsPass(mainDrawLoop_, modelViewMatrix, projectionMatrix, viewToWorldMatrix);

// restore parameter
glAssert( glDisable(GL_BLEND) );
glAssert( glDepthFunc(GL_LESS) );
glAssert( glDepthMask(GL_TRUE) );

std::cout << gpuTimers_ << std::endl;

}

One last thing, the FBO configuration. I have 2 FBOs: one to render in a "normal" mode and the other to get what I'm trying to achieve.


void MyRenderer::setViewport(int width, int height)
{
width_ = width;
height_ = height;

camera_->setScreenWidthAndHeight(width_, height_);
//Renderer::setViewport(width, height);

fbo_->setSize(width_, height_);

if (textures_[DEPTH_TEXTURE]->getId() != 0)
textures_[DEPTH_TEXTURE]->deleteGL();
textures_[DEPTH_TEXTURE]->initGL(GL_DEPTH_COMPONENT24, width_, height_, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, NULL);
if (textures_[NORMAL_TEXTURE]->getId() != 0)
textures_[NORMAL_TEXTURE]->deleteGL();
if (textures_[COLOR_TEXTURE]->getId() != 0)
textures_[COLOR_TEXTURE]->deleteGL();

if (textureLayered->getId() != 0)
textureLayered->deleteGL();

textures_[NORMAL_TEXTURE]->initGL(GL_RGBA32F, width_, height_, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
textures_[COLOR_TEXTURE]->initGL(GL_RGBA32F, width_, height_, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// Init Texture 2D Array
textureLayered->initGLTexture2DArray(GL_RGB32F, width, height, 2);

fbo_->bind();
fbo_->attachTexture(GL_COLOR_ATTACHMENT0, textures_[COLOR_TEXTURE]);
fbo_->attachTexture(GL_COLOR_ATTACHMENT1, textures_[NORMAL_TEXTURE]);
fbo_->attachTexture(GL_DEPTH_ATTACHMENT, textures_[DEPTH_TEXTURE]);
fbo_->check();
fbo_->unbind();

fboLayered_->setSize(width, height);
fboLayered_->bind();
fboLayered_->attachTextureArray(GL_COLOR_ATTACHMENT0, textureLayered);
fboLayered_->check();
fboLayered_->unbind();

FBO::bindDefault();
glAssert( glDrawBuffer(GL_BACK) );
glAssert( glReadBuffer(GL_BACK) );
}


I am using a 3D engine provided by my professor, so I don't really know if you have enough information to see where my problem comes from. I've been stuck for several days and I've tried everything I found on different forums... :/

And these are my shaders (attached to the original post).

Thank you for your help! :)

Brioche

brioche
06-01-2015, 06:32 PM
I think I found where the problem is. Everything works well up to the point where I draw the texture. I tried with my normal FBO and it works, but that is not layered rendering, so I think that to display the result I have to draw it layer by layer, but I didn't find how to do that.
Can someone explain how I can draw my result, which is the 2D array texture stored in my layered framebuffer, layer by layer, or just draw the one layer I want?

Thank you in advance for your help!

brioche
06-02-2015, 06:40 AM
up?

I would just like to know how to render only one layer from my layered texture (I'm a complete beginner :( ) x)

GClements
06-02-2015, 07:25 AM
I just would like to know how to render only one layer from my layered texture
You can use glFramebufferTextureLayer() to attach a single layer of a layered texture (e.g. a 2D array texture) to a framebuffer. The result is a non-layered framebuffer (i.e. setting gl_Layer has no effect).

If you use glFramebufferTexture() with a 2D array texture, the framebuffer will be layered, and the layer is specified by writing to gl_Layer in the geometry shader. If no geometry shader is present or it doesn't write to gl_Layer, rendering will use layer zero.
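
For example, a minimal pass-through geometry shader that writes gl_Layer might look like this (a sketch only; the layerIndex uniform is hypothetical, not something from your engine):

#version 330 core

layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

uniform int layerIndex; // hypothetical uniform: which layer of the array texture to render into

void main(void)
{
for (int i = 0; i < 3; i++) {
gl_Position = gl_in[i].gl_Position;
gl_Layer = layerIndex; // selects the layer of the layered framebuffer
EmitVertex();
}
EndPrimitive();
}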

brioche
06-02-2015, 07:34 AM
Thank you for your answer, I went with the second way, with the geometry shader. But my layered texture is in my layered FBO. Now I would like to draw it to the display using a new shader program to apply, for example, a blur effect, but I don't want to draw all the layers at once, just one layer at a time. My goal is to draw all the layers, layer by layer, using shader programs with a different blur effect on each layer.
Isn't there a way to just get the first layer of my layered texture and draw it on the screen with a new shader program?

GClements
06-02-2015, 07:58 AM
Isn't there a way to just get the first layer of my layered texture and draw it on the screen with a new shader program?
To use it as a source, bind the texture with glBindTexture(GL_TEXTURE_2D_ARRAY), then access it in the fragment shader via a sampler2DArray uniform variable. The texture() overload for a sampler2DArray takes a vec3 as its second argument, where the third component is rounded to the nearest integer to obtain the layer.
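
As a sketch (reusing the variable names from this thread; the "layer" uniform is just an illustration), a fragment shader that displays a single layer could be:

#version 330 core

in vec4 varTexCoord;
out vec4 outColor;

uniform sampler2DArray color;
uniform float layer; // hypothetical uniform: index of the layer to display

void main(void)
{
// the third component of the vec3 selects the layer
outColor = texture(color, vec3(varTexCoord.xy, layer));
}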

Alternatively, if you're assuming OpenGL 4.3 or later, you can use glTextureView (https://www.opengl.org/sdk/docs/man4/html/glTextureView.xhtml) to create a texture view which treats a single layer of a 2D array texture as a 2D texture. The view can then be bound to a texture unit and accessed via a sampler2D uniform variable in the shader.
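
A rough sketch of the texture-view approach (assuming OpenGL 4.3, an immutable-storage array texture created with glTexStorage3D rather than glTexImage3D, a compatible GL_RGBA32F format, and a placeholder name arrayTexId for the array texture):

// Create a GL_TEXTURE_2D view onto layer 0 of the 2D array texture.
GLuint viewId;
glGenTextures(1, &viewId); // the name must not have been bound to a target yet
glTextureView(viewId, GL_TEXTURE_2D, arrayTexId, GL_RGBA32F,
0, 1, // minlevel, numlevels
0, 1); // minlayer, numlayers: just layer 0
glBindTexture(GL_TEXTURE_2D, viewId);
// the shader can now read it through an ordinary sampler2D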

reader1
06-02-2015, 08:57 AM
mark it.
which is the 2D array texture stored in my layered framebuffer, layer by layer, or just draw the one layer I want?

I have a suggestion: if you store your texture layered, you must have an ID for each texture, so you could render the corresponding ID for each layer. I don't know if that is right.

reader1
06-02-2015, 09:03 AM
GClements has replied. His answer will be right; mine can be ignored.

brioche
06-02-2015, 10:56 AM
Thank you for your help, I will try it and keep you posted on the result :)

Thanks :)!

brioche
06-02-2015, 10:35 PM
It works, thanks! :)

But I have a little problem. As I told you, my textures are different parts of the scene at different depth intervals, and I would like to apply a different level of blur to each layer. I thought I could make different shaders with different blur effects and draw the first layer, then the second, and so on, so that the whole scene would be rendered with the different blur effects. But only the last layer gets drawn.

Is there a way to draw without erasing the layers drawn before, only replacing a pixel when the new layer is nearer than the previous one? Or can I do it with a single shader that selects the right layer to draw?

I'm really sorry, I only started OpenGL a few weeks ago and I would like to finish this today x).

Thank you for your help again :)

GClements
06-03-2015, 05:51 AM
But I only have the last layer drawn.
1. Your texture array needs an alpha channel.
2. The texture needs to be cleared to transparent (alpha = 0) before rendering.
3. When you render the layers, blending needs to be enabled.
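
A minimal sketch of those changes, reusing the names from your earlier posts (the exact wrapper calls in your engine may differ):

// 1. allocate the array texture with an alpha channel
textureLayered->initGLTexture2DArray(GL_RGBA32F, width, height, 2);

// 2. clear the layered FBO to transparent before rendering the scene
fboLayered_->bind();
glClearColor(0.f, 0.f, 0.f, 0.f); // alpha = 0 -> transparent
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... render the scene into the layers ...

// 3. enable blending when compositing the layers to the screen
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); // premultiplied alpha; GL_SRC_ALPHA for straight alpha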

brioche
06-03-2015, 09:31 AM
Thank you again!

I tried but I don't really understand how to set alpha = 0.

This is my render loop, showing only the drawing part:


FBO::bindDefault();

glAssert( glDrawBuffer(GL_BACK) );
glAssert( glDepthFunc(GL_ALWAYS) );
glAssert( glViewport(0, 0, width(), height()) );

glAssert( glEnable(GL_BLEND) );

glAssert( glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) );

ShaderProgram *shader = sceneManager_->getAsset()->getShaderProgram(blur1ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));
screenQuad_->draw();

shader = sceneManager_->getAsset()->getShaderProgram(blur2ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));
screenQuad_->draw();


I tried adding glColorMask(0,0,0,1); and glClear(GL_COLOR_BUFFER_BIT), but that doesn't seem to be what I need.

GClements
06-03-2015, 09:53 AM
The initial setting is glClearColor(0,0,0,0), which is transparent.

The main change to your original code is to set the internalformat parameter to GL_RGBA so that it has an alpha channel.

brioche
06-03-2015, 10:02 AM
I already have an alpha channel in the internalFormat of my texture:

textures_[LAYERED_COLOR_TEXTURE]->texture2DArrayInitGL(GL_RGBA32F, width_, height_, 3, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

I used glClearColor(0,0,0,0) before drawing, but it doesn't seem to have any effect :/

GClements
06-03-2015, 10:23 AM
Is your blur shader preserving the alpha from the input?

Also, as a test, you could have the blur shader execute a "discard" if the input alpha is less than 1.0.

brioche
06-03-2015, 10:38 AM
I tried with discard; I get the two layers rendered, but the result is strange x).

(screenshot attached to the original post)

GClements
06-03-2015, 11:11 AM
If discarding works but blending doesn't, that suggests that the blur shader isn't setting the alpha correctly in the output.
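
One way to handle that in the blur shader (a sketch of the idea only, not your exact code) is to accumulate RGBA over the kernel and gamma-correct only the colour channels, so the accumulated alpha is passed through and fragments that only covered transparent texels stay transparent:

vec4 sum = vec4(0.0);
// ... accumulate texture(color, vec3(uv + offset, layer)) into sum over the kernel ...
outColor = vec4(pow(sum.rgb, vec3(1.0 / 2.2)), sum.a); // keep the accumulated alpha as-is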

brioche
06-03-2015, 04:31 PM
I don't really understand how I should set the alpha output in my shader. This is my blur shader, do you have an idea?


precision highp float; // needed only for version 1.30

in vec4 varTexCoord;
out vec4 outColor;

uniform sampler2DArray color;

void main(void)
{

const float blurSizeH = 1.0 / 750.0;
const float blurSizeV = 1.0 / 750.0;
vec4 sum = vec4(0.0);
for (int x = -4; x <= 4; x++) {
for (int y = -4; y <= 4; y++) {
sum += texture(color, vec3(varTexCoord.x + x * blurSizeH, varTexCoord.y + y * blurSizeV, 0)) / 81.0;
}
}

outColor = pow(sum, vec4(1.0/(2.2)));

}

GClements
06-03-2015, 06:49 PM
To clarify: you're calling glClearColor(0,0,0,0) then glClear() after binding the layered (2D array texture) framebuffer and before rendering the scene, right?

brioche
06-03-2015, 07:00 PM
Yes, that's what I am doing:


glAssert( glEnable(GL_BLEND) );

glAssert( glClearColor(0,0,0,0));

ShaderProgram *shader = sceneManager_->getAsset()->getShaderProgram(blur2ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));
glAssert( glClear(GL_COLOR_BUFFER_BIT));

screenQuad_->draw();

shader = sceneManager_->getAsset()->getShaderProgram(blur1ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));
glAssert( glClear(GL_COLOR_BUFFER_BIT));

screenQuad_->draw();

But my second draw erases the first one. If I don't put the second glClear() before drawing, I get the two textures, but the rendering is not what I would like to have:
(screenshot attached to the original post)

GClements
06-04-2015, 05:15 AM
Yes, that's what I am doing
Your process has two distinct stages:
1. Render the scene into a layered framebuffer backed by an array texture.
2. Render the layers into the window.

The layered framebuffer should be cleared to transparent at the start of step 1. The window should be cleared to your preferred background colour at the start of step 2.

You're showing step 2, which isn't the issue here.
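
A rough sketch of that ordering, using the names from your earlier posts (the details of the passes are elided):

// stage 1: scene -> layered FBO
layeredFbo_->bind();
glClearColor(0, 0, 0, 0); // transparent
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... ambient pass, light passes ...
layeredFbo_->unbind();

// stage 2: layers -> window
FBO::bindDefault();
glClearColor(0.1f, 0.1f, 0.1f, 1.0f); // background colour
glClear(GL_COLOR_BUFFER_BIT);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
// ... draw the screen quad once per layer, each with its blur shader ...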

brioche
06-04-2015, 06:01 AM
Thank you very much, but I can't make it work... This is my rendering loop:


glm::mat4 modelViewMatrix = camera_->getModelViewMatrix();
glm::mat4 projectionMatrix = camera_->getProjectionMatrix();

//
// Important note before modifying this method :
// see MyRenderer::setViewport for FBO configurations
//
glm::mat4x4 viewToWorldMatrix = glm::inverse(modelViewMatrix);

// draw the scene
layeredFbo_->bind();
layeredFbo_->useAsTarget(width_, height_);

// Clear the framebuffer to transparent
glAssert( glEnable(GL_BLEND) );
glAssert(glClearColor(0, 0, 0, 0));

glAssert(glDrawBuffers(2, bufs));
glAssert(glClearDepth(1.0));
glAssert(glDepthFunc(GL_LESS));
//glAssert(glDisable(GL_BLEND));

layeredFbo_->clear(FBO::ALL);// to clear all attached texture

// render ambient and normal
ambientPass(ambientAndNormalLoop_, modelViewMatrix, projectionMatrix, viewToWorldMatrix);

// setup per light rendering : blend each pass onto the previous one
glAssert(glDrawBuffers(1, bufs));
glAssert(glDepthFunc(GL_LEQUAL));
glAssert(glEnable(GL_BLEND));
glAssert(glBlendFunc(GL_ONE, GL_ONE));
glAssert(glDepthMask(GL_FALSE));

// render for each light
lightsPass(mainDrawLoop_, modelViewMatrix, projectionMatrix, viewToWorldMatrix);

// restore parameter
//glAssert( glDisable(GL_BLEND) );
glAssert( glDepthFunc(GL_LESS) );
glAssert( glDepthMask(GL_TRUE) );

//Draw texture
layeredFbo_->unbind();
FBO::bindDefault();

glClearColor(0.1,0.1,0.1,1.0);

glAssert( glDrawBuffer(GL_BACK) );
glAssert( glDepthFunc(GL_ALWAYS) );
glAssert( glViewport(0, 0, width(), height()) );

ShaderProgram *shader = sceneManager_->getAsset()->getShaderProgram(blur2ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));

screenQuad_->draw();

shader = sceneManager_->getAsset()->getShaderProgram(blur1ShaderId_);
shader->bind();
glAssert(glActiveTexture(GL_TEXTURE0));
glAssert(glBindTexture(GL_TEXTURE_2D_ARRAY, textures_[LAYERED_COLOR_TEXTURE]->getId()));

screenQuad_->draw();

// restore parameter
glAssert( glDisable(GL_BLEND) );
glAssert( glDepthFunc(GL_LESS) );
glAssert( glDepthMask(GL_TRUE) );
glAssert( glDisable(GL_BLEND) );

std::cout << gpuTimers_ << std::endl;

So I enabled GL_BLEND and cleared the framebuffer to transparent.
After that I draw the scene into my layered framebuffer.
I bind the default framebuffer to draw on the screen using glClearColor at the beginning to set the color of the background.

I'm sure I'm close, but it just doesn't want to work :mad:

Thank you again.

GClements
06-04-2015, 06:29 AM
Okay, that's a bit more complex than I'd originally assumed (deferred shading). Even so, provided that the colour buffer is cleared to transparent prior to rendering, any fragments not drawn should still be transparent afterwards, regardless of shaders, colours, blending modes, etc.

Something else to consider: a layered framebuffer requires that all of the attachments are layered, i.e. colour buffer, depth buffer, and the buffer for the normals. All of the attached textures should have the same number of layers; if gl_Layer exceeds the number of layers for any of the attachments, the result is undefined.
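
For example, a fully layered setup might look roughly like this (a sketch only; the texture and FBO ids here are placeholders, and your Texture class would need a depth-format variant of initGLTexture2DArray):

// every attachment is an array texture with the same layer count
GLuint depthArray;
glGenTextures(1, &depthArray);
glBindTexture(GL_TEXTURE_2D_ARRAY, depthArray);
glTexImage3D(GL_TEXTURE_2D_ARRAY, 0, GL_DEPTH_COMPONENT24,
width, height, layerCount, 0,
GL_DEPTH_COMPONENT, GL_UNSIGNED_INT, NULL);

glBindFramebuffer(GL_FRAMEBUFFER, layeredFboId);
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, colourArrayId, 0); // layered colour
glFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1, normalArrayId, 0); // layered normals
glFramebufferTexture(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, depthArray, 0); // layered depth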

reader1
06-04-2015, 07:15 PM
What is the principle of layered rendering? If there are several objects in a scene, how do you divide them into different layers for rendering?