Why do I need glClear(GL_DEPTH_BUFFER_BIT)?



pasire
01-25-2015, 05:09 AM
Here is the code:

#include <GL/glew.h> // include GLEW and new version of GL on Windows
#include <GLFW/glfw3.h> // GLFW helper library
#include <stdio.h>

int main () {
    // start GL context and O/S window using the GLFW helper library
    if (!glfwInit ()) {
        fprintf (stderr, "ERROR: could not start GLFW3\n");
        return 1;
    }

    // these hints are needed on Apple OS X to get a 4.1 core profile context
    glfwWindowHint (GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint (GLFW_CONTEXT_VERSION_MINOR, 1);
    glfwWindowHint (GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);
    glfwWindowHint (GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow* window = glfwCreateWindow (640, 480, "Hello Triangle", NULL, NULL);
    if (!window) {
        fprintf (stderr, "ERROR: could not open window with GLFW3\n");
        glfwTerminate ();
        return 1;
    }
    glfwMakeContextCurrent (window);

    // start GLEW extension handler
    glewExperimental = GL_TRUE;
    glewInit ();

    // get version info
    const GLubyte* renderer = glGetString (GL_RENDERER); // get renderer string
    const GLubyte* version = glGetString (GL_VERSION);   // version as a string
    printf ("Renderer: %s\n", renderer);
    printf ("OpenGL version supported %s\n", version);

    // tell GL to only draw onto a pixel if the shape is closer to the viewer
    glEnable (GL_DEPTH_TEST); // enable depth-testing
    glDepthFunc (GL_LESS);    // depth-testing interprets a smaller value as "closer"

    /* OTHER STUFF GOES HERE NEXT */

    // vertex positions of a single triangle
    float points[] = {
         0.0f,  0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
        -0.5f, -0.5f, 0.0f
    };

    // copy the vertex data into a buffer object
    GLuint vbo = 0;
    glGenBuffers (1, &vbo);
    glBindBuffer (GL_ARRAY_BUFFER, vbo);
    glBufferData (GL_ARRAY_BUFFER, 9 * sizeof (float), points, GL_STATIC_DRAW);

    // describe the vertex layout with a vertex array object
    GLuint vao = 0;
    glGenVertexArrays (1, &vao);
    glBindVertexArray (vao);
    glEnableVertexAttribArray (0);
    glBindBuffer (GL_ARRAY_BUFFER, vbo);
    glVertexAttribPointer (0, 3, GL_FLOAT, GL_FALSE, 0, NULL);

    const char* vertex_shader =
        "#version 410\n"
        "layout(location = 0) in vec4 vPosition;"
        "void main () {"
        "    gl_Position = vPosition;"
        "}";
    const char* fragment_shader =
        "#version 410\n"
        "out vec4 frag_colour;"
        "void main () {"
        "    frag_colour = vec4 (0.5, 0.0, 0.5, 1.0);"
        "}";

    // compile the shaders and link them into a program
    GLuint vs = glCreateShader (GL_VERTEX_SHADER);
    glShaderSource (vs, 1, &vertex_shader, NULL);
    glCompileShader (vs);
    GLuint fs = glCreateShader (GL_FRAGMENT_SHADER);
    glShaderSource (fs, 1, &fragment_shader, NULL);
    glCompileShader (fs);
    GLuint shader_programme = glCreateProgram ();
    glAttachShader (shader_programme, fs);
    glAttachShader (shader_programme, vs);
    glLinkProgram (shader_programme);

    while (!glfwWindowShouldClose (window)) {
        // wipe the drawing surface clear
        glClear (GL_DEPTH_BUFFER_BIT);
        const GLfloat color[] = {0.0, 0.2, 0.0, 1.0};
        //glClearBufferfv(GL_COLOR, 0, color);
        glUseProgram (shader_programme);
        glBindVertexArray (vao);
        // draw vertices 0-2 from the currently bound VAO with the current in-use shader
        glDrawArrays (GL_TRIANGLES, 0, 3);
        // update other events like input handling
        glfwPollEvents ();
        // put the stuff we've been drawing onto the display
        glfwSwapBuffers (window);
    }

    // close GL context and any other GLFW resources
    glfwTerminate ();
    return 0;
}
When I comment out the line "glClear(GL_DEPTH_BUFFER_BIT)", the window does not display anything. Why is that?

Agent D
01-25-2015, 05:20 AM
Using a depth buffer is a technique to handle overlapping triangles. Your OpenGL(R) framebuffer has a 640 * 480 pixel color buffer, holding a color value for each pixel, and an additional 640 * 480 pixel depth buffer, holding a distance value for each pixel.

When you render triangles (or other primitives), the depth values of the vertices are interpolated across the triangle, and when the final fragments are written to the framebuffer, the depth value goes to the depth buffer and the color value to the color buffer.

When the depth test is enabled, a fragment is only written when its depth value passes a certain criterion (e.g. being smaller than the value already in the depth buffer). This way, when two triangles overlap, the fragments of the one closer to the viewer overwrite the fragments of the one farther away, while the fragments of the farther triangle are discarded and don't overwrite those of the closer one, independent of the order in which they are drawn.

glClear(GL_DEPTH_BUFFER_BIT) resets all depth buffer pixels to an initial value (e.g. 1.0 for maximum depth). If you don't do this, the fragments of a new frame are compared against the depth values left over from the previous frame, or, on the very first frame, against whatever undefined values the depth buffer happens to contain, which can make every fragment fail the test so that nothing is drawn at all.
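
For example (just a sketch, reusing the window, vao and shader_programme variables from the code you posted), a render loop would typically clear both buffers at the start of every frame:

    // set the value the color buffer is cleared to (a dark green here)
    glClearColor (0.0f, 0.2f, 0.0f, 1.0f);
    while (!glfwWindowShouldClose (window)) {
        // reset both buffers so every frame starts from a known state:
        // each color pixel becomes the clear color, each depth pixel becomes 1.0
        glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glUseProgram (shader_programme);
        glBindVertexArray (vao);
        glDrawArrays (GL_TRIANGLES, 0, 3);

        glfwPollEvents ();
        glfwSwapBuffers (window);
    }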



This is very basic stuff. Please consult a beginner OpenGL(R) tutorial.

pasire
01-25-2015, 06:14 AM
The first example in the OpenGL Programming Guide (8th edition) does not have this call in it to draw triangles. I am confused.

Alfonse Reinheart
01-25-2015, 07:27 AM
The first example in the OpenGL Programming Guide (8th edition) does not have this call in it to draw triangles. I am confused.

It also probably doesn't use a depth buffer or depth testing, so it doesn't care what the depth value of the triangle it draws is.
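
I don't have the book in front of me, but that kind of example boils down to a loop of roughly this shape (sketched here with the GLFW loop from your code rather than the book's exact framework): depth testing is never enabled, so only the color buffer needs to be cleared:

    // sketch: no glEnable (GL_DEPTH_TEST) anywhere, so no depth values are
    // read or written and the depth buffer never needs clearing
    static const GLfloat black[] = { 0.0f, 0.0f, 0.0f, 1.0f };
    while (!glfwWindowShouldClose (window)) {
        glClearBufferfv (GL_COLOR, 0, black); // clear the color buffer only
        glUseProgram (shader_programme);
        glBindVertexArray (vao);
        glDrawArrays (GL_TRIANGLES, 0, 3);
        glfwPollEvents ();
        glfwSwapBuffers (window);
    }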

pasire
01-25-2015, 07:36 AM
How do I know whether it uses a depth buffer or not?

Alfonse Reinheart
01-25-2015, 07:51 AM
Unless you're doing copy-and-paste coding, it's impossible to accidentally use the depth buffer, since you have to deliberately enable depth testing. So if you enabled depth testing, then you're trying to use the depth buffer and should therefore clear it to a known value.

And if you're doing copy-and-paste coding... stop it!
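
If you want to double-check what a running program has actually set up, you can also query the state at runtime. A small sketch (assuming a current GL context, as in the code at the top of the thread):

    // query whether depth testing and depth writes are currently enabled
    GLboolean depth_test_on = glIsEnabled (GL_DEPTH_TEST);
    GLboolean depth_writes_on = GL_FALSE;
    glGetBooleanv (GL_DEPTH_WRITEMASK, &depth_writes_on);
    printf ("depth test: %s, depth writes: %s\n",
        depth_test_on ? "on" : "off",
        depth_writes_on ? "on" : "off");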

pasire
01-25-2015, 08:49 AM
I have figured it out, thanks.