roguelike rendering

Hi everybody,

Currently I am planning to program a small roguelike game. I want to use OGL for several reasons and wonder how to render the world efficiently.
The problem is that one game screen is composed of 80 x 25 small tiles (like a standard console), each containing a character. The characters are taken from a big texture (with alpha), and every tile has a fore- and background color which can change every frame.
Since the single tiles stay where they are the whole time and only the textures and colors change, I decided to use a vertex buffer to render everything. The geometry is pretty static; the color and texture-coordinate arrays are rebuilt every frame. There is only one small improvement: if I know from the game logic that nothing has changed between two frames, I don't recalculate the arrays and just reuse the old ones.
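Roughly, the setup I have in mind looks like this (just a sketch with illustrative names, using plain client-side vertex arrays; the VBO version would be analogous):

```cpp
// Sketch: static quad positions, per-frame color/texcoord arrays
// for an 80x25 grid, drawn with client-side vertex arrays.
#include <GL/gl.h>

const int COLS = 80, ROWS = 25, TILES = COLS * ROWS;

static GLfloat positions[TILES * 4 * 2]; // 4 vertices per tile, built once
static GLfloat texcoords[TILES * 4 * 2]; // rebuilt when a glyph changes
static GLubyte colors[TILES * 4 * 4];    // RGBA per vertex, rebuilt on change

// Fill the static geometry once at startup.
void buildPositions(float tileW, float tileH)
{
    for (int y = 0; y < ROWS; ++y)
        for (int x = 0; x < COLS; ++x) {
            GLfloat* p = &positions[(y * COLS + x) * 8];
            float x0 = x * tileW, y0 = y * tileH;
            p[0] = x0;         p[1] = y0;
            p[2] = x0 + tileW; p[3] = y0;
            p[4] = x0 + tileW; p[5] = y0 + tileH;
            p[6] = x0;         p[7] = y0 + tileH;
        }
}

// One draw call for the whole screen.
void drawGrid()
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);

    glVertexPointer(2, GL_FLOAT, 0, positions);
    glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
    glColorPointer(4, GL_UNSIGNED_BYTE, 0, colors);

    glDrawArrays(GL_QUADS, 0, TILES * 4);

    glDisableClientState(GL_COLOR_ARRAY);
    glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```

The position array never changes after startup; only the texcoord and color arrays are touched when the screen content changes.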

I would really appreciate some comments on this. Perhaps you know some tricks to improve performance, or would you render it completely differently?

Given the performance levels of current (and even much older) cards, you won't see any performance problems even if you use immediate mode. They are used to drawing thousands and thousands of triangles per frame, with complex shaders and whatever. So don't worry about performance; just think about gameplay :wink:

Thanks for your reply, Zengar.

I forgot to mention one thing. To realise different fore- and background colors, I render the whole thing twice: once for the background with alpha off, and a second time for the characters/foreground color with alpha on. I am just curious if this is a good solution.
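Schematically, the two passes look like this (a sketch building on the drawGrid idea above; fillBackgroundColors, fillGlyphData, and fontTexture are made-up names):

```cpp
// Sketch of the two-pass frame.
// fillBackgroundColors() and fillGlyphData() are hypothetical helpers
// that refill the color/texcoord arrays; fontTexture is the glyph atlas.
#include <GL/gl.h>

void fillBackgroundColors();
void fillGlyphData();
void drawGrid();
static GLuint fontTexture; // created at load time

void renderFrame()
{
    // Pass 1: solid background colors, untextured, no blending.
    glDisable(GL_TEXTURE_2D);
    glDisable(GL_BLEND);
    fillBackgroundColors(); // per-tile background color into the color array
    drawGrid();

    // Pass 2: glyphs from the font atlas, alpha-blended over the background.
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, fontTexture);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    fillGlyphData();        // atlas texcoords plus foreground colors
    drawGrid();
}
```

The blending in the second pass only touches the glyph pixels, so the background color shows through everywhere else.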

But you’re absolutely right… I should actually concentrate on making a playable game…

What is your fps performance right now, btw? It wouldn’t be hard to throw together a test app that just renders the scene according to your algorithm. It is pointless to try to optimize something that already runs fast enough.

On my current machine fps is not the issue; it’s got a decent graphics adapter. But on the PC at my brother’s place I only get around 150-170 fps. That sounds like plenty at first sight, but I want to implement a line-of-sight algorithm, some lighting, and, well, gameplay :wink: and I know these things cost performance. If I remember correctly, that PC has an old GF4 in it. I mean, it runs well; it’s just that I would like to know if I am generally doing it right.
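For the line-of-sight part, one common and cheap approach is to walk a Bresenham line between the two tiles and stop at the first blocking cell. A sketch (opaque() is a made-up query into the map data):

```cpp
// Sketch: visibility test between two tiles along a Bresenham line.
#include <cstdlib>

bool opaque(int x, int y); // hypothetical: true if the tile blocks sight

bool lineOfSight(int x0, int y0, int x1, int y1)
{
    int dx = std::abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -std::abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;

    while (!(x0 == x1 && y0 == y1)) {
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
        // Check intermediate cells only, so walls themselves stay visible.
        if ((x0 != x1 || y0 != y1) && opaque(x0, y0))
            return false; // sight blocked before reaching the target
    }
    return true;
}
```

On an 80 x 25 grid that is cheap enough to run for every visible tile each turn, so it should not eat into the frame budget much.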

Well, as long as it is above 30-40 fps, it is ok ))