Does OpenGL deallocate the resources for me? (with instanced rendering)



Omar Bsoul
10-14-2017, 12:15 AM
I am trying to make an editing application based on OpenGL. I have a VBO that holds the transformation matrices, and I use instanced rendering to draw all the instances. If the user wants to add a new object, I have to resize the buffer. How would I do that?
By deleting the old buffer, generating a new one, filling it with the old data, and then adding the new data?
Or by just calling glBufferData without worrying about the buffer size?
Or is there a better way to do what I want?

john_connor
10-14-2017, 02:06 AM
I am trying to make an editing application based on OpenGL. I have a VBO that holds the transformation matrices, and I use instanced rendering to draw all the instances. If the user wants to add a new object, I have to resize the buffer. How would I do that?

that ("user input") means the buffer doesnt have to be resized that frequently ...
you can pre-allocate a buffer large enough that can hold all the comming transformation matrices (100 or so). if you need more matrices, just re-allocate (lets say double) its size for the time the app runs, by calling:

glNamedBufferData(mybuffer, new_size_here, new_data_here, GL_DYNAMIC_DRAW);



By deleting the old buffer, generating a new one, filling it with the old data, and then adding the new data?

No, you can keep the old one; re-specifying its storage with glNamedBufferData is enough.



Or by just calling glBufferData without worrying about the buffer size?

Of course, but the buffer must be large enough to hold all the data ...
Just check:

if (new_minimum_size > current_size)
{
    // grow by 1.5x until the buffer is large enough for the requested size
    while (current_size < new_minimum_size)
        current_size = (current_size * 3) / 2;
    glNamedBufferData(mybuffer, current_size, nullptr /* fill it later */, GL_DYNAMIC_DRAW);
}
To specify the buffer content later, use glNamedBufferSubData(...) or glMapNamedBuffer(...).
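
As a minimal sketch of that grow-and-refill pattern (assuming GL 4.5 DSA functions loaded through a loader such as glad, glm for the matrix type, and a CPU-side copy of the matrices; the helper name AppendInstance is made up). Note that glNamedBufferData discards the old contents, so after growing we re-upload everything:

#include <glad/glad.h>   // or whatever loader provides the GL 4.5 functions
#include <vector>
#include <glm/glm.hpp>

void AppendInstance(GLuint buffer, std::vector<glm::mat4>& matrices,
                    GLsizeiptr& capacity, const glm::mat4& m)
{
    matrices.push_back(m);
    const GLsizeiptr needed = matrices.size() * sizeof(glm::mat4);
    if (needed > capacity)
    {
        while (capacity < needed)
            capacity = (capacity * 3) / 2;                       // grow by 1.5x
        glNamedBufferData(buffer, capacity, nullptr, GL_DYNAMIC_DRAW);
        glNamedBufferSubData(buffer, 0, needed, matrices.data()); // re-upload all
    }
    else
    {
        // enough room left: upload just the new matrix at the end
        glNamedBufferSubData(buffer, needed - sizeof(glm::mat4),
                             sizeof(glm::mat4), &m);
    }
}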


As long as you call glCreateBuffers(...) once (and glDeleteBuffers() when finished), there won't be a resource leak. Also, all resources are freed when the GL context is destroyed.

Alfonse Reinheart
10-14-2017, 08:45 AM
Don't resize buffers. Instead, have a maximum size, and if the desired size exceeds it, then you're rendering too many instances and you have to decide how to handle that.
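
A minimal sketch of that fixed-capacity approach (the names kMaxInstances, vbo_id, instanceCount and newMatrix are hypothetical, and glm is assumed for the matrix type):

constexpr GLsizeiptr kMaxInstances = 4096;

// allocate the full capacity once, at startup
glNamedBufferData(vbo_id, kMaxInstances * sizeof(glm::mat4), nullptr, GL_DYNAMIC_DRAW);

// when the user adds an object:
if (instanceCount >= kMaxInstances)
{
    // out of room: reject the new object, split into multiple draws,
    // or report an error, whatever fits the application
}
else
{
    glNamedBufferSubData(vbo_id, instanceCount * sizeof(glm::mat4),
                         sizeof(glm::mat4), &newMatrix);
    ++instanceCount;
}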

Omar Bsoul
10-14-2017, 12:19 PM
Thanks, john, for your reply.

Could you please check my sample code? Is it right?

glGenBuffers(1, &vbo_id);
glBindBuffer(GL_ARRAY_BUFFER, vbo_id);
glBufferData(GL_ARRAY_BUFFER, NumberOfBytes, nullptr, GL_DYNAMIC_DRAW);
//...
// some process for adding data and setting the attribpointers....
//...
glBindBuffer(GL_ARRAY_BUFFER, 0);
// and when the buffer is full -->
glNamedBufferData(vbo_id, NewNumberOfBytes, nullptr, GL_DYNAMIC_DRAW);




As long as you call glCreateBuffers(...) once (and glDeleteBuffers() when finished)
Actually, I am using glGenBuffers(...), so is glDeleteBuffers() valid for the generated buffer?
And what is the difference between glGenBuffers(...) and glCreateBuffers(...)?


all resources are freed when the GL context is destroyed
Could you please explain how the GL context gets destroyed?

john_connor
10-14-2017, 03:16 PM
Could you please check my sample code? Is it right?

Yes. Just keep in mind that glNamedBufferData gives you fresh (uninitialized) storage, so after resizing you have to re-upload the old contents as well.



Actually, I am using glGenBuffers(...), so is glDeleteBuffers() valid for the generated buffer?
And what is the difference between glGenBuffers(...) and glCreateBuffers(...)?

glGenBuffers() creates buffer "names" (handles); the buffer objects don't exist until you bind them to a target for the first time. glCreateBuffers() creates complete buffer objects right away (without having to bind them). glDeleteBuffers() deletes buffers regardless of how they were created (glGen* or glCreate*).
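
A short sketch illustrating the difference (assuming a GL 4.5 context):

GLuint a, b;

glGenBuffers(1, &a);               // 'a' is only a reserved name so far
glBindBuffer(GL_ARRAY_BUFFER, a);  // the first bind creates the actual buffer object

glCreateBuffers(1, &b);            // 'b' is a complete buffer object immediately
glNamedBufferData(b, 64, nullptr, GL_DYNAMIC_DRAW);  // valid without ever binding 'b'

glDeleteBuffers(1, &a);            // valid for both creation paths
glDeleteBuffers(1, &b);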



Could you please explain how the GL context gets destroyed?

When your GL window gets destroyed, the GL context is destroyed along with it, and with the context all the buffers / textures / etc.

GClements
10-14-2017, 08:08 PM
Could you please explain how the GL context gets destroyed?
This is normally done by the toolkit (GLUT, GLFW, Qt, etc). Most toolkits automatically create one or more contexts whenever you create a window and destroy them when the window is destroyed.
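
For example, with GLFW the usual lifetime looks roughly like this (a sketch; GLFW creates the context together with the window and destroys it with the window):

#include <GLFW/glfw3.h>

int main()
{
    glfwInit();
    GLFWwindow* window = glfwCreateWindow(800, 600, "GL", nullptr, nullptr);
    glfwMakeContextCurrent(window);  // the context was created with the window

    // ... create buffers, render ...

    glfwDestroyWindow(window);       // destroys the window and its context;
                                     // any remaining GL objects are freed here
    glfwTerminate();
}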

If you're using the platform's native OpenGL API directly, contexts are destroyed with glXDestroyContext(), wglDeleteContext() or eglDestroyContext().

Note that if contexts are part of a sharing group, buffers are shared between contexts in the group, so they're only destroyed when the last context in the group is destroyed.

Omar Bsoul
10-16-2017, 09:36 AM
Thanks to you all, you really helped me.