Does OpenGL support instancing?



MarkS
08-18-2009, 06:37 PM
I'm working on a maze game (I know, not very original...). The game will start out with a 3x3-cell level, and each level after that grows in size. Making a maze in 3D (actually 2.5D) is trivial, but I do not want to limit the maximum level size.

If you look at a 2D maze, you can see five different cell types. Using a plumbing analogy, there are tees, crosses, caps, couplings and elbows. What I want to do is load one instance of each cell type and only process them when they are called for. The actual level data structure would only contain the cell type, its orientation and position. If each cell type is, say, 1MB in size, then a 512x512 level will occupy over 262,000 MB (256 GB) of RAM if the levels are constructed normally with each cell containing a copy of its polygons. This is obviously not an option.

I know that DirectX has instancing built in, but is there anything like it in GL? If GL does not have such a feature, what is the best way to go about it?

I'm sorry if my description is not very clear. I'm at work and don't have access to my files. I can post pics later tonight.

Alfonse Reinheart
08-18-2009, 07:04 PM
OpenGL supports two mechanisms for instancing, based on two extensions. One of these is incorporated into GL 3.1's core.

These two extensions match the two kinds of instancing that D3D has historically supported. D3D9's instancing, based on divisors, is implemented in GL_ARB_instanced_arrays. It is not part of the GL core, and most hardware that implements GL 3.x does not implement it (for the same reason that D3D10 doesn't support it).

The other extension is GL_ARB_draw_instanced. It is based on D3D10-style instancing, which is shader-driven.
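To give a feel for the shader-driven path, here is a hedged GLSL sketch (the uniform name and the batch size of 64 are invented for illustration): ARB_draw_instanced exposes gl_InstanceIDARB in the vertex shader, which can index per-instance data such as a transform.

```glsl
// Sketch only: assumes per-instance transforms were uploaded as a uniform
// array; "cellTransforms" and its size are made-up names/values.
#version 120
#extension GL_ARB_draw_instanced : require

uniform mat4 cellTransforms[64];  // one matrix per instance in this batch

void main()
{
    // Each instance of the shared cell mesh selects its own transform.
    gl_Position = gl_ModelViewProjectionMatrix
                * cellTransforms[gl_InstanceIDARB]
                * gl_Vertex;
}
```

On the application side, a single glDrawElementsInstancedARB(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, 0, instanceCount) call then draws every instance in the batch.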

MarkS
08-18-2009, 07:10 PM
OK, what about under GL 2.1? I need this to run on Macs and PCs, so I'm limited by Apple's drivers. I believe that they still only support GL 2.1? Regardless, I'd like to hold off on a 3.0 implementation for the time being.

Alfonse Reinheart
08-18-2009, 09:52 PM
They're extensions. Extensions can be supported on any version.

Now, whether they actually are supported is another question.

todayman
08-18-2009, 10:52 PM
I don't believe that Apple has added support for this extension. It's not even listed on their list of capabilities (http://developer.apple.com/graphicsimaging/opengl/capabilities/).

I've been itching to get my hands on 3.1 and now 3.2, but I guess I'll have to wait or get something going with Linux...

Alfonse Reinheart
08-18-2009, 11:49 PM
Apple's dragging their feet more than ATI. And that's saying something.

Heiko
08-19-2009, 12:16 AM
I thought Mac OS X Snow Leopard (coming in September) would start supporting OpenGL 3.x (I don't know which specific version).

ZbuffeR
08-19-2009, 05:15 AM
If you really have a lot of triangles per block, use one VBO for each unique block.

Dark Photon
08-19-2009, 06:17 AM
OK, what about under GL 2.1? I need this to run on Macs and PCs, so I'm limited by Apple's drivers. I believe that they still only support GL 2.1? Regardless, I'd like to hold off on a 3.0 implementation for the time being.
NVidia has supported EXT/ARB_draw_instanced in their drivers since at least last October, on Linux and MSWin. And yes, under OpenGL 2.1 as well as OpenGL 3.x. You just have to have a GeForce 8 or better card, of course.

Stuart McDonald
08-19-2009, 09:31 AM
I don't think you need instancing for what you want to do. As ZbuffeR says, just use a VBO for each unique cell.

Imagine you were drawing a crowd of identical monsters: you would just draw the same monster multiple times at different positions and orientations. That's just normal rendering.

Brolingstanz
08-19-2009, 11:37 AM
How fancy are the cells? Could be that a geometry shader would be enough to send your maze this way and that. It might be fun to implement the Pipes screensaver that way...