wglUseFontOutlines - horribly slow



angryGLfan
05-09-2002, 10:19 AM
I'm trying to use wglUseFontOutlines, but it seems to render at about 2-3 frames per second. I practically copied the code from a gametutorials.com tutorial, which runs much faster than mine. What am I doing wrong? Thanks.

mikael_aronsson
05-09-2002, 11:00 PM
Hi !

Not sure what it could be, but one thing that might cause problems is if you set the tessellation too high; this can generate a huge amount of triangles/lines. (It's the 5th argument, the deviation: the closer to zero you set this value, the more geometry will be generated.)
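For example, a call looks roughly like this (a rough sketch; hDC must be a DC with a TrueType font already selected, and the numbers are just illustrative):

GLYPHMETRICSFLOAT gmf[256];
GLuint base = glGenLists(256);

// 5th argument is the deviation: 0.0f means maximum tessellation,
// a larger value like 0.5f generates far less geometry.
wglUseFontOutlines(hDC, 0, 256, base,
                   0.5f,              // deviation
                   0.2f,              // extrusion along -z
                   WGL_FONT_POLYGONS, // filled polygons (or WGL_FONT_LINES)
                   gmf);              // per-glyph metrics, filled in for you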

That's the only thing I can think of at the moment.

Mikael

angryGLfan
05-10-2002, 10:28 AM
Thanks for the reply.

No, it's not too much geometry. The frame rate is the same at half quality, and even for just one character.

Any other ideas anyone?

DFrey
05-10-2002, 11:00 AM
Are you using a software OpenGL implementation? Check the GL_RENDERER string.
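Something like this, after your context is made current:

#include <stdio.h>
#include <GL/gl.h>

void PrintRenderer(void)
{
    // "GDI Generic" means the Microsoft software implementation.
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
}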

angryGLfan
05-10-2002, 12:58 PM
Nope, using a hardware renderer. The flying-text screensaver works fine. The wglUseFontOutlines tutorials downloaded from the net work fine. All other OpenGL programs work fine. I used wglUseFontOutlines with one character and no extrusion and still get only 2-3 fps. I just don't get it. Tried it on three different boxes. ???

DFrey
05-10-2002, 01:48 PM
The fact that that screensaver is using hardware acceleration does not mean your program is as well. Check the GL_RENDERER string. The fact that it is slow on multiple systems suggests to me that your context is using the Microsoft software renderer, or that you are in general doing something that causes your OpenGL implementation to take a software path.
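On Windows you can also check the pixel format you actually got (a sketch, assuming hDC is your window's DC):

#include <windows.h>
#include <stdio.h>

void CheckAcceleration(HDC hDC)
{
    PIXELFORMATDESCRIPTOR pfd;
    int fmt = GetPixelFormat(hDC);
    DescribePixelFormat(hDC, fmt, sizeof(pfd), &pfd);

    // Generic format without generic-accelerated means the
    // Microsoft software renderer is in use.
    if ((pfd.dwFlags & PFD_GENERIC_FORMAT) &&
        !(pfd.dwFlags & PFD_GENERIC_ACCELERATED))
        printf("Warning: this pixel format is not hardware accelerated\n");
}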

angryGLfan
05-11-2002, 08:46 AM
Thanks for all the help guys, but I figured it out. I used a function to create the display lists and return an integer for the list base:

#define BASE CreateOutlineFonts( ... )

I used the #define to hold the value of the integer because it was going to remain constant throughout the program. Then I changed it to a global variable and it works fine.
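So the fixed version looks something like this (InitFonts and the hDC parameter are just how I've sketched it here):

GLuint g_fontBase;                        // global, assigned exactly once

void InitFonts(HDC hDC)
{
    g_fontBase = CreateOutlineFonts(hDC); // build the display lists once
}

void RenderText(void)
{
    glListBase(g_fontBase);               // cheap: just sets the list base
    glCallLists(5, GL_UNSIGNED_BYTE, "Hello");
}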

The question now is, why would I need the base in a variable instead of being able to use the #define?

Omaha
05-12-2002, 06:39 AM
It looks to me like you were using BASE when drawing your letters, meaning you were also generating them on the fly while drawing... repeatedly.

If so, I'm curious how long it would run before crashing...
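If CreateOutlineFonts looks anything like the usual tutorial version, every call grabs a fresh block of display lists and never frees them, so it leaks on top of re-tessellating the font every frame. Roughly (the body here is a guess):

GLuint CreateOutlineFonts(HDC hDC)
{
    GLYPHMETRICSFLOAT gmf[256];
    GLuint base = glGenLists(256);  // 256 brand-new lists on every call,
                                    // never released with glDeleteLists
    wglUseFontOutlines(hDC, 0, 256, base, 0.0f, 0.2f, WGL_FONT_POLYGONS, gmf);
    return base;
}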

angryGLfan
05-12-2002, 09:07 AM
I used the #define construct outside of my rendering loop, so I shouldn't have been recreating the lists as I was going, but it did seem to act like that. After about 5-10 seconds it got even slower. I never let it get to the point of crashing, but I always had to "End Task" it or I couldn't get it to close.

bakery2k
05-12-2002, 03:03 PM
Using the #define WILL call CreateOutlineFonts each frame.
Remember that the preprocessor simply does a text substitution for #define.
If you had:

for (;;)
{
    ...
    glListBase(BASE);
}

it will be converted to:

for (;;)
{
    ...
    glListBase(CreateOutlineFonts(...));
}


angryGLfan
05-13-2002, 07:05 AM
I didn't realize that it was just a strict text substitution. That explains a lot. Thank you.