
View Full Version : Getting 2D Graphics into OpenGL



lpetrich
03-28-2000, 10:22 PM
Some games have both 2D and 3D graphics, with the 2D graphics usually being some kind of status display. This is no problem for either software rendering or OpenGL-in-a-window rendering on a 2D/3D card.

However, some cards do not support OpenGL in a window, such as the older 3dfx ones, and I'm asking what might be the best way to get the 2D graphics displayed.

One way is to draw the 2D parts in an offscreen buffer and then blit them to the screen with glDrawPixels(). This could be made smarter by blitting only the updated parts.

How well does that solution work in practice? I'm especially interested in considering that as a source-code improvement for Bungie's Marathon engine (http://source.bungie.org).

Hude
03-29-2000, 05:31 AM
Use a fullscreen window to get hardware acceleration on 3dfx. Don't use glDrawPixels(); draw textured quads instead.
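A minimal sketch of what Hude describes, assuming fixed-function OpenGL 1.1 and a status image already uploaded as a texture (the `hud_tex` handle and the rectangle parameters are placeholders, not from any real engine):

```cpp
// Sketch: draw a 2D overlay as a textured quad on top of the 3D scene.
// Assumes the texture is already created and bound-able as hud_tex.
void draw_hud(GLuint hud_tex, int screen_w, int screen_h,
              int x, int y, int w, int h)
{
    // Switch to a pixel-aligned orthographic projection.
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(0, screen_w, screen_h, 0, -1, 1);   // y grows downward, like 2D blits
    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    glDisable(GL_DEPTH_TEST);                   // overlay always lands on top
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, hud_tex);

    glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2i(x,     y);
    glTexCoord2f(1, 0); glVertex2i(x + w, y);
    glTexCoord2f(1, 1); glVertex2i(x + w, y + h);
    glTexCoord2f(0, 1); glVertex2i(x,     y + h);
    glEnd();

    // Restore state for the next 3D pass.
    glEnable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
    glPopMatrix();
}
```

This stays in the accelerated path the whole time, which is why it runs well even on fullscreen-only hardware.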

lpetrich
03-30-2000, 11:28 PM
So it would be necessary to create some texture tiles and then use them as intermediate buffers?

It would be especially tricky for 3dfx cards, I suppose, with their 256*256 limit.

BTW, how do OpenGL drivers for 3dfx cards usually handle big textures?

Hude
03-31-2000, 02:26 AM
So it would be necessary to create some texture tiles and then use them as intermediate buffers?

Yup.


It would be especially tricky for 3dfx cards, I suppose, with their 256*256 limit.

Only if you really need 1:1 images. With a bit of quality loss you can scale the textures when the original dimensions are not a power of two.


BTW, how do OpenGL drivers for 3dfx cards usually handle big textures?

I guess they are first scaled so that width and height are a power of two. And the maximum size is 256x256, as you said.

Also, using glDrawPixels makes your fps crawl...

[This message has been edited by Hude (edited 03-31-2000).]

AndersO
03-31-2000, 02:34 AM
On the 3dfx 256*256 issue:

Either you subdivide your polygons to the point where 256*256 "fits", or you downsize the original texture to 256*256.
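The arithmetic behind both options is simple. A sketch in plain C++ (the helper names are mine, assuming the 3dfx limits discussed above: power-of-two dimensions, 256 maximum):

```cpp
// Round n up to the next power of two, as required for GL texture dimensions.
int next_pow2(int n) {
    int p = 1;
    while (p < n) p <<= 1;
    return p;
}

// Number of tiles of size max_tex needed to cover `size` pixels at 1:1.
int tiles_needed(int size, int max_tex = 256) {
    return (size + max_tex - 1) / max_tex;
}
```

For example, a 640x480 screen covered 1:1 needs 3x2 tiles of 256x256, while a 200-pixel-wide status bar would be padded (or scaled) up to a 256-wide texture.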

TheGecko
04-09-2000, 08:42 AM
You see... this is the thing that pisses me off about OpenGL... the fact that you can't efficiently mix 2D bitmaps with 3D scenes. This totally blows! If I want to use bitmaps then I should be able to! I shouldn't have to resort to using textured polygons.

Forgive me if I sound aggressive, but I have spent a lot of time studying OpenGL only to come to something this trivial and find out I can't mix 2D bitmaps with my 3D scene. Now I have to resort to using DirectX, which means I have to spend MORE time studying another API and then convert ALL my code... Unless, of course, the next release of OpenGL fixes this whole bitmap mess, I really don't have any other choice.

kel
04-09-2000, 08:42 PM
I can understand your frustration, but I wouldn't say that this is a "bug" in OpenGL, it just helps keep everything a little more homogeneous. If you do move to DirectX, please keep in mind that it is highly recommended that, when using 3D, you should draw 2D bitmaps on polygons (same as OpenGL). The reason for this is that the switch from 3D to 2D and back is fairly expensive, so it's much more efficient to stay in the Direct3D mode whenever possible. The technical details are explained somwhere in the depths of the D3D docs. So, moving to DirectX won't make your life any easier (if you do things "right"). I'm assuming that this expensive mode switching in the reason OpenGL works this way also. The way I get around it in C++ is to create a class that knows how to draw bitmaps as textured polygons and then I only deal with that class...