
Thread: Does OpenGL deallocate the resources for me? (with instanced rendering)

  1. #1 Omar Bsoul (Junior Member)

    I am making an editing application based on OpenGL. I have a VBO that holds the transformation matrices, and I use instanced rendering to draw all the instances. If the user adds a new object, I have to resize that buffer. How should I do that?
    By deleting the old buffer, generating a new one, filling it with the old data and then adding the new data?
    Or by just calling glBufferData without worrying about the buffer size?
    Or is there a better way to do this?

  2. #2 john_connor (Regular Contributor)
    Quote Originally Posted by Omar Bsoul
    I am making an editing application based on OpenGL. I have a VBO that holds the transformation matrices, and I use instanced rendering to draw all the instances. If the user adds a new object, I have to resize that buffer. How should I do that?
    that ("user input") means the buffer doesnt have to be resized that frequently ...
    you can pre-allocate a buffer large enough that can hold all the comming transformation matrices (100 or so). if you need more matrices, just re-allocate (lets say double) its size for the time the app runs, by calling:
    Code :
    glNamedBufferData(mybuffer, new_size_here, new_data_here, GL_DYNAMIC_DRAW);


    Quote Originally Posted by Omar Bsoul
    By deleting the old buffer, generating a new one, filling it with the old data and then adding the new data?
    No, you can keep the old one.


    Quote Originally Posted by Omar Bsoul
    Or by just calling glBufferData without worrying about the buffer size?
    The buffer must of course be large enough to hold all the data. Just check:
    Code :
    if (new_minimum_size > current_size)
    {
    	// a single 1.5x step may not be enough, so grow until it fits
    	while (current_size < new_minimum_size)
    		current_size += current_size / 2;
    	glNamedBufferData(mybuffer, current_size, nullptr /* fill it later */, GL_DYNAMIC_DRAW);
    }
    To specify the buffer content later, use glNamedBufferSubData(...) or glMapNamedBuffer(...). Note that re-allocating with glNamedBufferData discards the previous contents, so after growing the buffer you also have to re-upload the old data.
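
    Putting those pieces together, here is a minimal sketch of a growable instance buffer backed by a CPU-side copy (GL 4.5 DSA; the InstanceBuffer struct and all its names are made up for illustration, and init() is assumed to be called with a non-zero size):
    Code :
    #include <vector>

    struct Mat4 { float m[16]; };

    struct InstanceBuffer
    {
    	GLuint buf = 0;
    	GLsizeiptr capacity = 0;     // allocated size in bytes
    	std::vector<Mat4> matrices;  // CPU-side copy of the instance data

    	void init(GLsizeiptr initial_bytes)  // initial_bytes must be > 0
    	{
    		glCreateBuffers(1, &buf);
    		capacity = initial_bytes;
    		glNamedBufferData(buf, capacity, nullptr, GL_DYNAMIC_DRAW);
    	}

    	void add(const Mat4& m)
    	{
    		matrices.push_back(m);
    		GLsizeiptr needed = (GLsizeiptr)(matrices.size() * sizeof(Mat4));
    		if (needed > capacity)
    		{
    			while (capacity < needed)
    				capacity += capacity / 2;  // grow by ~1.5x
    			// re-allocating discards the old contents ...
    			glNamedBufferData(buf, capacity, nullptr, GL_DYNAMIC_DRAW);
    		}
    		// ... so (re-)upload from the CPU-side copy
    		glNamedBufferSubData(buf, 0, needed, matrices.data());
    	}

    	void destroy() { glDeleteBuffers(1, &buf); }
    };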


    As long as you call glCreateBuffers(...) once (and glDeleteBuffers() when finished), there won't be a resource leak. Also, all the resources are freed when the GL context is destroyed.

  3. #3 Senior Member (OpenGL Lord)
    Don't resize buffers. Instead, have a maximum size, and if the desired size exceeds it, then you're rendering too many instances and you have to decide how to handle that.
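
    A sketch of that fixed-capacity approach (vbo_id, Mat4, instance_count and the clamp policy are placeholders, not prescribed by this post):
    Code :
    const GLsizeiptr kMaxInstances = 4096;  // fixed maximum, chosen up front
    glNamedBufferData(vbo_id, kMaxInstances * sizeof(Mat4), nullptr, GL_DYNAMIC_DRAW);

    // at draw time, clamp (or reject the add, or split into batches; the policy is yours)
    GLsizei count = (instance_count <= kMaxInstances) ? (GLsizei)instance_count
                                                      : (GLsizei)kMaxInstances;
    glDrawArraysInstanced(GL_TRIANGLES, 0, vertex_count, count);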

  4. #4 Omar Bsoul (Junior Member)
    Thanks, john, for your reply.

    Could you please check my sample code? Is it right?
    Code :
    glGenBuffers(1, &vbo_id);
    glBindBuffer(GL_ARRAY_BUFFER, vbo_id);
    glBufferData(GL_ARRAY_BUFFER, NumberOfBytes, nullptr, GL_DYNAMIC_DRAW);
    //...
    // some process for adding data and setting the attrib pointers (sketched below) ...
    //...
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    // and when the buffer is full -->
    glNamedBufferData(vbo_id, NewNumberOfBytes, nullptr, GL_DYNAMIC_DRAW);
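
    (The attrib-pointer setup hinted at in the comment above typically looks like this for instanced mat4s; a sketch, assuming the matrix attribute starts at location 3, to be run while the VBO is still bound:)
    Code :
    // a mat4 instance attribute occupies 4 consecutive vec4 locations
    for (int i = 0; i < 4; ++i)
    {
    	glEnableVertexAttribArray(3 + i);
    	glVertexAttribPointer(3 + i, 4, GL_FLOAT, GL_FALSE,
    	                      sizeof(float) * 16,
    	                      (const void*)(sizeof(float) * 4 * i));
    	glVertexAttribDivisor(3 + i, 1);  // advance once per instance
    }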


    Quote Originally Posted by john_connor
    As long as you call glCreateBuffers(...) once (and glDeleteBuffers() when finished)
    Actually I am using glGenBuffers(...); is glDeleteBuffers() valid for a buffer generated that way?
    And what is the difference between glGenBuffers(...) and glCreateBuffers(...)?

    Quote Originally Posted by john_connor
    all the resources are freed when the GL context is destroyed
    Could you please explain how the GL context gets destroyed?

  5. #5 john_connor (Regular Contributor)
    Quote Originally Posted by Omar Bsoul
    Could you please check my sample code? Is it right?
    Yes.


    Quote Originally Posted by Omar Bsoul
    Actually I am using glGenBuffers(...); is glDeleteBuffers() valid for a buffer generated that way?
    And what is the difference between glGenBuffers(...) and glCreateBuffers(...)?
    glGenBuffers() only creates buffer "handles" (names); the buffer objects themselves don't exist until you bind them to a target for the first time. glCreateBuffers() creates the buffer objects right away (without having to bind them). glDeleteBuffers() deletes buffers regardless of how they were created (glGen* or glCreate*).
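
    A short sketch of the difference (glCreateBuffers needs GL 4.5 / ARB_direct_state_access):
    Code :
    // glGenBuffers: reserves a name; the buffer object itself is
    // created on the first bind
    GLuint a;
    glGenBuffers(1, &a);
    glBindBuffer(GL_ARRAY_BUFFER, a);  // object comes into existence here

    // glCreateBuffers: name + object are created immediately, no bind needed
    GLuint b;
    glCreateBuffers(1, &b);
    glNamedBufferData(b, 64, nullptr, GL_STATIC_DRAW);  // works right away

    // glDeleteBuffers works for both
    GLuint ids[2] = { a, b };
    glDeleteBuffers(2, ids);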


    Quote Originally Posted by Omar Bsoul
    Could you please explain how the GL context gets destroyed?
    When your GL window gets destroyed, the GL context gets destroyed too, and with it all the buffers, textures, etc.

  6. #6 Senior Member (OpenGL Guru)
    Quote Originally Posted by Omar Bsoul
    Could you please explain how the GL context gets destroyed?
    This is normally done by the toolkit (GLUT, GLFW, Qt, etc). Most toolkits automatically create one or more contexts whenever you create a window and destroy them when the window is destroyed.

    If you're using the platform's native OpenGL API directly, contexts are destroyed with glXDestroyContext(), wglDeleteContext() or eglDestroyContext().

    Note that if contexts are part of a sharing group, buffers are shared between contexts in the group, so they're only destroyed when the last context in the group is destroyed.
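
    For example, with GLFW (a minimal sketch; error handling omitted):
    Code :
    #include <GLFW/glfw3.h>

    int main(void)
    {
    	glfwInit();
    	GLFWwindow* win = glfwCreateWindow(640, 480, "demo", NULL, NULL);
    	glfwMakeContextCurrent(win);  // the context was created together with the window
    	/* ... create buffers, render ... */
    	glfwDestroyWindow(win);  // the context dies here, and its buffers with it
    	glfwTerminate();
    	return 0;
    }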

  7. #7 Omar Bsoul (Junior Member)
    Thanks to you all, you really helped me.
