Cockpit/UserInterface

Hi. We are programming a game and need a cockpit around our 3D scene to interact with the user. We have tried using the stencil buffer and it gives the desired result, but it is so d***n slow. We would really appreciate it if anyone could help us with a faster solution to this problem.

OK, assuming you are currently using perspective projection, the easiest solution would be to:

Switch to orthographic projection,

Render a texture-mapped object right up close to the eye position (this object is your internal view: dashboard, cockpit or whatever),

Switch back to perspective projection when you draw your scene.

Hope that helps,
Dans

Decided to do that, will see if it works out by tomorrow. Thanks for the tip.

Strange that you say stencil buffering is slow … it’s almost free on most cards and I haven’t noticed any performance penalty from using it (at least not after I got a working driver that supports stenciling; otherwise it’s very slow). Make sure you do not have a 32-bit depth buffer; it should be 24-bit depth and 8-bit stencil so that depth+stencil fit in 32 bits.

> it’s almost free on most cards
Have you tried it on the Voodoo3?

Nothing is free…

> Make sure you do not have a 32 bit depthbuffer, it should be 24bits and 8bit stencil…
Are you saying a 32-bit (well, 24+8) depth buffer has no performance penalty compared with a 16-bit one without stencil?
BTW, on some 3D cards (for example, the TNT/TNT2) a 32-bit Z-buffer can’t be used with a 16-bit color buffer, only with a 32-bit one.
It is a little bit slower, isn’t it?

I believe that it’s more a matter of memory usage than speed. It probably is a little slower, but the extra precision is usually worth it if it’s supported…

SergeK, well … since the Voodoo3 doesn’t support stencil buffers (and sucks in general, as the upcoming Voodoo4/5 will too) you’ll get software rendering … but on cards that do support it, it’s almost free. I ran at 30 fps with it and 30 fps without …
And who cares about 16-bit color buffers (except those who swallow 3dfx propaganda)?
It’s ugly and obsolete. I wouldn’t even consider using 16-bit color in either my own apps or when playing games.

Yes, a 16-bit color buffer looks quite bad, especially with multipass rendering.
But in fill-rate-limited situations you may have no other choice.
Not everybody has a GeForce, TNT2, G400 or even a Rage128.
(BTW, the Rage128 in true color is definitely faster with a 16-bit Z-buffer.)

Also, a 16-bit color buffer can be used with supersampling.
2x2 downsampling almost completely suppresses dithering artefacts.
Of course, the downsampled image should be 24-bit.
(I don’t know about the GeForce; if it downsamples 16->16, that’s bad…)

I agree, Hummus. I won’t do anything special with my code to make it run on Voodoos. It’s 3Dfx’s problem that both their cards and their drivers bite, not mine.

I guess your product is not for sale…

Even if it were, I wouldn’t. I think 3Dfx desperately needs to get their act back together.