View Full Version : Drawing to background frame buffer

02-24-2014, 11:41 AM
Hello all,

I'm trying to get my feet wet with frame buffers. I've set up a frame buffer based on a number of online tutorials, but the texture I attached to it doesn't seem to receive anything I draw. I like to debug these things myself whenever possible, but I don't know how to look inside the frame buffer to see what's going on at runtime. Here is how I set up the frame buffer. Can anyone tell me if something is missing?

bool Setup_FrameBuffers()
{
    // FramebufferName, renderedTexture and depthrenderbuffer are global GLuint
    pglGenFramebuffers(1, &FramebufferName);
    pglBindFramebuffer(GL_FRAMEBUFFER, FramebufferName);

    // Texture
    glGenTextures(1, &renderedTexture);
    glBindTexture(GL_TEXTURE_2D, renderedTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 640, 385, 0, GL_RGB, GL_UNSIGNED_BYTE, 0);
    pglFramebufferTexture(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, renderedTexture, 0);
    glBindTexture(GL_TEXTURE_2D, 0);

    // The depth buffer
    pglGenRenderbuffers(1, &depthrenderbuffer);
    pglBindRenderbuffer(GL_RENDERBUFFER, depthrenderbuffer);
    pglRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT, 640, 385);
    pglFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthrenderbuffer);

    GLenum DrawBuffers[1] = {GL_COLOR_ATTACHMENT0};
    pglDrawBuffers(1, DrawBuffers);

    if (pglCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        return false;

    pglBindFramebuffer(GL_FRAMEBUFFER, 0);
    return true;
}

The fragment shader I use is:

#version 330

varying vec3 lighted_color;

layout(location = 0) out vec4 output_color;

void main(void)
{
    output_color = vec4(lighted_color, 1.0);
}

Basically I 1) Switch to the frame buffer, 2) render the scene, 3) switch back to the screen, and 4) draw the rendered texture. Nothing appears.

If I bypass the whole thing and render the scene directly to the screen (without this frame buffer business), it appears fine. If I swap in a pregenerated texture for step 4, that appears fine as well, but the rendered texture never does. Suggestions?

PS: The "p" in front of some glFunctions is just a reminder to me that I had to pull them in from extensions. They are otherwise the same glFunctions.
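One detail worth flagging in the setup above (this is my own guess, not something stated elsewhere in the thread): the color texture never gets any filter parameters. OpenGL's default minifying filter is GL_NEAREST_MIPMAP_LINEAR, which expects mipmap levels that are never generated here, so the texture stays "incomplete" and samples as solid black. A minimal sketch of the parameters that would make it complete, to be placed right after the glTexImage2D call:

```c
/* Sketch: make the FBO color texture "texture complete".  With the GL
 * defaults, the min filter is GL_NEAREST_MIPMAP_LINEAR, which requires
 * mipmaps that glTexImage2D alone does not create; sampling such a
 * texture returns black.  renderedTexture is the global from above. */
glBindTexture(GL_TEXTURE_2D, renderedTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
```

This fragment needs a current GL context and the texture bound, so it cannot run standalone.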

02-26-2014, 08:08 AM
Anyone? I just need to know whether the code I posted is correct so I can narrow my debugging. I'm not sure how to examine the internals of the frame buffer at runtime...
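Two checks can run at setup and draw time without any special tools (a sketch; the pgl-prefixed names are assumed to be the same extension-loaded pointers used in the setup code):

```c
/* Sketch: verify the FBO is usable, and drain the GL error queue.
 * Call the first check while the FBO is bound during setup, and the
 * error loop anywhere a failure is suspected. */
GLenum status = pglCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
    printf("FBO incomplete: 0x%04X\n", status);

GLenum err;
while ((err = glGetError()) != GL_NO_ERROR)
    printf("GL error: 0x%04X\n", err);
```

Both calls are core framebuffer-object API; the fragment needs a current GL context, so it cannot run standalone.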

02-26-2014, 02:09 PM
Seems I overlooked this thread... but don't get impatient; I have another thread on a very similar topic and no one replied there for three weeks ;)

A quick look through your code shows it's very similar to mine, with a few differences in ordering (which shouldn't cause the failure), and I think you even used the same tutorial I did... so it should work. Maybe the way you draw into the buffer is wrong? Please provide the code for that!

I have no idea at all what your fragment shader is supposed to do. It just passes one color through, but to get the result on screen you should rather be drawing a quad with reasonable texture coordinates. An example from my code:

ToScreen vertex shader:

#version 330

layout(location = 0) in vec4 vVertex;
layout(location = 1) in vec2 vTex;
out vec2 texCoord;

void main()
{
    texCoord = vTex;
    gl_Position = vVertex;
}

ToScreen fragment shader:

#version 330

uniform sampler2D renderTexture;
in vec2 texCoord;
out vec4 color;

void main()
{
    color = texture(renderTexture, texCoord);
}

VBO that's being drawn there:

Vertex_3D_tex2D fieldData[] = {
    {{-1.0f, -1.0f, 0}, {0.0f, 0.0f}},
    {{ 1.0f, -1.0f, 0}, {1.0f, 0.0f}},
    {{-1.0f,  1.0f, 0}, {0.0f, 1.0f}},
    {{ 1.0f,  1.0f, 0}, {1.0f, 1.0f}}
};

The issue in my other thread is a failure with the depth test... but this should not be the issue in your problem here.
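For completeness, the quad above can be drawn as a triangle strip. A minimal sketch, assuming the ToScreen program is bound, the VBO's attributes 0 and 1 are set up to match the vertex shader, and renderedTexture is the FBO color texture from the first post:

```c
/* Sketch: draw the four fieldData vertices as a triangle strip while
 * sampling the off-screen texture through texture unit 0. */
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, renderedTexture);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
```

The strip order matches the fieldData layout above (bottom-left, bottom-right, top-left, top-right); needs a current GL context to run.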

02-28-2014, 10:46 AM
The drawing part of my code looks like this:

//pglBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0); // switching to the default buffer lets the quad show up normally
pglBindFramebuffer(GL_DRAW_FRAMEBUFFER, FramebufferName); // switch to this one to attempt to draw on a background buffer

glBegin(GL_QUADS);
glNormal3f( 0.0f, 0.0f, 1.0f);
glVertex3f(  1.5, -1.5, -10.5 );
glNormal3f( 0.0f, 0.0f, 1.0f);
glVertex3f(  1.5,  1.5, -10.5 );
glNormal3f( 0.0f, 0.0f, 1.0f);
glVertex3f( -1.5,  1.5, -10.5 );
glNormal3f( 0.0f, 0.0f, 1.0f);
glVertex3f( -1.5, -1.5, -10.5 );
glEnd();

pglBindFramebuffer(GL_FRAMEBUFFER, 0);
Draw_Texture(renderedTexture, 2.0, -2.0, -2.0, 2.0, 2.0, 1, 1, 0); // renderedTexture is supposed to hold the background rendering, but never shows
//Draw_Texture(bmp_working_texture, 2.0, -2.0, -2.0, 2.0, 2.0, 1, 1, 0); // bmp_working_texture is a pregenerated image that shows normally

The Draw_Texture function displays textures and works for pregenerated textures. It looks like this:

void Draw_Texture(unsigned int image, GLfloat top, GLfloat left, GLfloat bottom, GLfloat right, GLfloat z, GLfloat percent_width_used, GLfloat percent_height_used, GLfloat alpha)
{
    glColor4f(0, 0, 0, alpha); // set black as alpha color
    glBindTexture(GL_TEXTURE_2D, image);
    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f);                              glVertex3f(left, top, z);
    glTexCoord2f(0.0f, percent_height_used);               glVertex3f(left, bottom, z);
    glTexCoord2f(percent_width_used, percent_height_used); glVertex3f(right, bottom, z);
    glTexCoord2f(percent_width_used, 0.0f);                glVertex3f(right, top, z);
    glEnd();
}

If renderedTexture is internally the same as any other texture, I can't see why Draw_Texture would treat it any differently. Is there some way I can check, with runtime debugging techniques, whether the image was properly written to renderedTexture? I suspect the data is not getting sent to renderedTexture correctly.
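One way to answer that question directly (a sketch; Check_RenderedTexture is a hypothetical helper, and 640x385 matches the glTexImage2D call in the setup code): read the texture back into client memory after the off-screen pass and scan it, or inspect the buffer in a debugger.

```c
#include <stdio.h>
#include <GL/gl.h>

/* Sketch: read renderedTexture back after the off-screen pass; if every
 * byte is zero, nothing was written into it and the problem is on the
 * rendering side, not in Draw_Texture. */
void Check_RenderedTexture(GLuint tex)
{
    static unsigned char pixels[640 * 385 * 3];

    glBindTexture(GL_TEXTURE_2D, tex);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels);
    glBindTexture(GL_TEXTURE_2D, 0);

    int nonzero = 0;
    for (size_t i = 0; i < sizeof pixels; ++i)
        if (pixels[i]) { nonzero = 1; break; }
    printf(nonzero ? "texture has content\n" : "texture is all black\n");
}
```

glGetTexImage is core GL (not an extension), so no pgl pointer is needed; like the rest of the code here it only works with a current GL context.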

I switched out the shader I was using for the fixed pipeline to minimize confusion.

03-26-2014, 12:41 PM
I hate to drag this back out into the light, but I still have no idea why this isn't working, and I don't know how to debug the situation. As far as I can tell, I set up the frame buffer and texture correctly, but rendering to it and then attempting to render the resulting texture yields a black screen. Can anyone please help me?