Weird FBO behaviour

Hello,

I’m having some trouble rendering to a texture, and I’m not quite sure what’s going on. Here’s a screenshot: http://i43.tinypic.com/1rq9nm.png

The texture attached via glFramebufferTexture2D seems to get corrupted image data. Also, adding a renderbuffer causes the texture attachment to stop working altogether :confused:

At first I thought the problem was that the texture I’m rendering to is also bound to the texture unit the shader samples from (well, not exactly sampled, but it is bound there; see the fragment shader below). So I loaded a dummy texture into GL_TEXTURE1, pointed the shader’s sampler at that while rendering the sphere, and swapped the sampler back when rendering the cube. That didn’t change anything at all. glCheckFramebufferStatus reports the FBO as complete, too.
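For reference, the swap looks roughly like this (dummyTexture_ is just a placeholder name for whatever texture I load into unit 1):

// setup: put a placeholder texture on unit 1, leave unit 0 for the FBO texture
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, dummyTexture_);
glActiveTexture(GL_TEXTURE0);

// texture pass: sample from the dummy texture on unit 1
program_.setUniform("texture_", 1);
// ... render the sphere into the FBO ...

// window pass: sample from the FBO texture on unit 0 again
program_.setUniform("texture_", 0);
// ... render the cube ...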

FBO setup:

// generate a framebuffer object
glGenFramebuffers(1, &fbo_);
glBindFramebuffer(GL_FRAMEBUFFER, fbo_);

// generate the texture for rendering
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &texture_);
glBindTexture(GL_TEXTURE_2D, texture_);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// attach the texture to the FBO
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture_, 0);

// generate a renderbuffer for the depth info
glGenRenderbuffers(1, &rbo_);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, 512, 512);

// attach the renderbuffer
//glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rbo_);

// bind the fragment shader output
GLenum drawBuffers[] = {GL_COLOR_ATTACHMENT0};
glDrawBuffers(1, drawBuffers);

// switch to the default framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindRenderbuffer(GL_RENDERBUFFER, 0);
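
The completeness check I mentioned is just the standard one, done while the FBO is still bound (i.e. before the unbind above):

// check completeness while the FBO is still bound
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(status != GL_FRAMEBUFFER_COMPLETE) {
	// report the error; for me this never triggers
}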

main render function:

// point the shader’s sampler at the dummy texture on unit 1
program_.setUniform("texture_", 1);
glBindFramebuffer(GL_FRAMEBUFFER, fbo_);
setupMatrices();

program_.setUniform("texRenderPass_", true);
model_.render();
program_.setUniform("texRenderPass_", false);
	
glBindFramebuffer(GL_FRAMEBUFFER, 0);
program_.setUniform("texture_", 0);
setupMatrices();

glBindVertexArray(cube_);
glDrawArrays(GL_TRIANGLES, 0, CUBE_VERTICES);

I don’t think the problem is with the shaders. The vertex shader just computes the data needed for Phong lighting and passes the texture coordinates to the fragment shader, which does the following:

vec3 ambDiff;
vec3 spec;
calculatePhongShading(ambDiff, spec);

if(texRenderPass_) {
	fragColor_ = vec4(ambDiff + spec, 1.0);
}
else {
	vec4 texColor = texture(texture_, texCoord_);
	fragColor_ = vec4(ambDiff, 1.0) * texColor + vec4(spec, 1.0);
}

I’m seriously confused by the results I’m getting… To me it doesn’t make any sense that the texturing breaks completely once I attach a renderbuffer (and why is the texture corrupted to begin with?). Hope you guys can shed some light on this.

Thanks in advance!

At this point, I suggest adding glBindTexture(GL_TEXTURE_2D, 0).
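For example, somewhere after the FBO setup:

// make sure the render-target texture isn’t left bound on a texture unit
glBindTexture(GL_TEXTURE_2D, 0);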

Does setupMatrices set the viewport to the dimensions of the framebuffer? Also, do you call glClear()?
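I.e. something along these lines around the FBO pass (just a sketch; windowWidth_/windowHeight_ stand in for however you track the window size):

glBindFramebuffer(GL_FRAMEBUFFER, fbo_);
glViewport(0, 0, 512, 512);                          // match the 512x512 attachment
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // clear the FBO's own buffers

// ... texture pass ...

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glViewport(0, 0, windowWidth_, windowHeight_);       // back to the window size
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);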

Where are you attaching the renderbuffer? If you attach it to GL_COLOR_ATTACHMENT0, it will be used instead of the texture.

Now I get the same result as on the left side of the posted picture: nothing but a box.

The viewport is the same as the window size: as soon as I started seeing problems, I made the window the same size as the texture I’m rendering to. Originally I had different matrices for the two passes, but for ease of debugging I’m using the same ones for both.

Here’s the function:

// get the size of the viewport
GLint viewport[4];
glGetIntegerv(GL_VIEWPORT, viewport);

// generate the main matrix data
glm::mat4 projectionMatrix = glm::perspective(45.0f, static_cast<float>(viewport[2]) / static_cast<float>(viewport[3]), 0.1f, 100.0f);
glm::mat4 modelviewMatrix = glm::lookAt(glm::vec3(0.0f, 0.0f, 5.0f), glm::vec3(0.0f, 0.0f, 0.0f), glm::vec3(0.0f, 1.0f, 0.0f));

// extract the top left 3x3 from the modelview matrix
glm::mat3 normalMatrix = glm::mat3(modelviewMatrix);

// normal matrix = ((modelview3x3)^-1)^T
normalMatrix = glm::inverse(normalMatrix);
normalMatrix = glm::transpose(normalMatrix);
		
// apply the matrices
program_.setUniform("mainMatrices_.projectionMatrix_", projectionMatrix);
program_.setUniform("mainMatrices_.modelviewMatrix_", modelviewMatrix);
program_.setUniform("mainMatrices_.normalMatrix_", normalMatrix);

// set up the lighting
glm::vec4 lightPos(0.0f, 0.0f, 3.0f, 1.0f);
lightPos = modelviewMatrix * lightPos;
glm::vec3 lightIntensity(1.0f, 1.0f, 1.0f);
program_.setUniform("light_.position_", lightPos);
program_.setUniform("light_.intensity_", lightIntensity);

glClear(GL_COLOR_BUFFER_BIT);

Here’s the renderbuffer segment:

// generate a renderbuffer for the depth info
glGenRenderbuffers(1, &rbo_);
glBindRenderbuffer(GL_RENDERBUFFER, rbo_);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, 512, 512);

// attach the renderbuffer
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, rbo_);

Thanks in advance once more!

You’ll need to bind the texture when you want to render from it into the window, but it shouldn’t be bound to a texture unit while you’re rendering into an FBO it’s attached to. The actual rules are slightly more relaxed than that, but it’s best to play it safe.

Depth buffer? Your code shows one being attached to the FBO, although I don’t know whether depth testing is enabled (there’s not much point in attaching one if it isn’t). If it’s present and enabled, it needs to be cleared before use.
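
Putting the two together, the per-frame ordering would look something like this (a sketch only, using the names from your earlier snippets):

// texture pass: the target texture must not be bound while rendering into its FBO
glBindFramebuffer(GL_FRAMEBUFFER, fbo_);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);  // clear the FBO's colour and depth
// ... render the sphere ...

// window pass: now bind the texture so the cube can sample it
glBindFramebuffer(GL_FRAMEBUFFER, 0);
glBindTexture(GL_TEXTURE_2D, texture_);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... render the cube ...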

Herp-a-derp. I swear I tried glClear() with GL_DEPTH_BUFFER_BIT… Anyway, it seems that was indeed the problem. I don’t really need the renderbuffer, so I tried removing it, but then the problem came back. I read that wrapping the texture pass in glEnable/glDisable(GL_DEPTH_TEST) would fix it, but that didn’t seem to work for me. Any ideas?
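
In case it matters, what I tried was roughly this (with no renderbuffer attached to the FBO):

glBindFramebuffer(GL_FRAMEBUFFER, fbo_);
glDisable(GL_DEPTH_TEST);   // the FBO has no depth attachment
// ... render the sphere into the texture ...
glEnable(GL_DEPTH_TEST);
glBindFramebuffer(GL_FRAMEBUFFER, 0);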

Thanks!

Might as well post here instead of a new thread:

I tried adding another FBO with a different texture attachment, but as soon as I call glFramebufferTexture2D() for the second framebuffer, everything breaks down. I’m not even rendering to the new FBO; I’m only using the previous one and the default framebuffer. But like I said, as soon as the texture is attached to the second FBO, the first one breaks. Any ideas?

Here’s the code:

// set up the second pass' framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, fbos_[1]);

// generate the texture attachment
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &texture2_);
glBindTexture(GL_TEXTURE_2D, texture2_);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, viewport[2], viewport[3], 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, texture2_, 0);

Thanks in advance!

EDIT: Never mind! Since it was hard to see what was going on with the FBO setups written out one after another, I created a class that encapsulates all of the details, and in the process I managed to fix the problem.