
View Full Version : wglUseFontBitmaps Annoyance



Omaha
01-11-2002, 10:45 AM
void CreateFonts()
{
    HFONT TempFont;
    HGDIOBJ OldFont;
    char Temp[40];

    TempFont = CreateFont(16, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                          ANSI_CHARSET, OUT_DEFAULT_PRECIS, CLIP_DEFAULT_PRECIS,
                          DEFAULT_QUALITY, DEFAULT_PITCH, "Arial");
    OldFont = SelectObject(MainHDC, TempFont);

    // uint/ushort are project typedefs for unsigned int/unsigned short
    FontDisplayLists = new uint[NUM_FONTS];
    for(ushort n = 0; n < NUM_FONTS; n++)
        FontDisplayLists[n] = glGenLists(256);

    if(FALSE == wglUseFontBitmaps(MainHDC, 0, 255, FontDisplayLists[FONT_DEFAULT]))
    {
        sprintf(Temp, "How very odd... (Error %u)", GetLastError());
        MessageBox(NULL, Temp, "Font Failure", 0);
    }

    // Restore the original font before deleting ours
    SelectObject(MainHDC, OldFont);
    DeleteObject(TempFont);
}

...where MainHDC is the HDC associated with the main OpenGL rendering context (RC). It's properly initialized and declared as an extern in the source file this snippet came from.

The error I'm getting is 87, which according to MSDN is ERROR_INVALID_PARAMETER, but for the life of me I can't see what's wrong with the wgl call. If there's nothing wrong here, I'll have to peruse other parts of my code; I'm just curious whether anyone sees something blatantly flawed in this call.

NUM_FONTS is 1 and FONT_DEFAULT is 0, to answer those questions before they come up. FontDisplayLists is just a plain old unsigned integer pointer, which is NULL when the function is first run (it's only called once during the whole program), and the new call seems to work properly (based on a quick check).

Any minor syntax errors are probably due to my cut 'n' paste!