3D font with Unicode through wglUseFontOutlines?

Hello!

I have a 3D font class which I have adapted from NeHe, and now I am adding multilingual support :)
I decided to use this method instead of bitmap fonts.
The Latin characters work fine, but there is an offset between the Unicode values I pass in and the glyphs that wglUseFontOutlines generates.

In particular, Greek characters start at Unicode value 900, so I generate display lists with wglUseFontOutlines for the range 900 to 976. The glyphs that come out are shifted by a value of 48.

e.g. the ‘A’ (alpha) character has Unicode value 913, but wglUseFontOutlines produces the correct glyph at value 913+48.

I have solved this by adding a ‘remap’ variable, but I don’t understand why they don’t match. Roughly, the workaround looks like the sketch below (the method and member names there are only illustrative; just the +48 shift comes from my actual code):
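
// Illustrative only: apply the observed offset when looking up the
// display list for a code point. DrawGlyph and m_remap are assumed names.
void CFont3dGLUnicode::DrawGlyph(wchar_t ch)
{
	int m_remap = 48;                      // empirically observed mismatch for the Greek range
	glCallList(m_Unicode[ch + m_remap]);   // the list that actually contains the expected glyph
}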

Does anyone know what I might be doing wrong?

 

void CFont3dGLUnicode::CreateFontInternal(char* fontname, int siz, int firstchar, int numlists, int charset)
{
	HFONT	font;						
	HDC		hdc = wglGetCurrentDC();  

	glListBase(0);	// reset
	int liststart = glGenLists(numlists);
	// load the font
	font = CreateFont(	-siz,							// Height Of Font
						0,								// Width Of Font
						0,								// Angle Of Escapement
						0,								// Orientation Angle
						FW_BOLD,						// Font Weight
						FALSE,							// Italic
						FALSE,							// Underline
						FALSE,							// Strikeout
						charset,					// Character Set Identifier
						OUT_TT_PRECIS,					// Output Precision
						CLIP_DEFAULT_PRECIS,			// Clipping Precision
						ANTIALIASED_QUALITY,			// Output Quality
						FF_DONTCARE|DEFAULT_PITCH,		// Family And Pitch
						fontname);				// Font Name

	if (!font)	return;

	// Selects The Font
	SelectObject(hdc, font);

	// build the font using the custom wgl function
	BOOL ret = wglUseFontOutlines(	hdc,				// Select The Current DC
						firstchar,								// Starting Character
						numlists,								// Number Of Display Lists To Build
						liststart,				// Starting Display Lists
						0.002f,							// Deviation From The True Outlines
						1.0, //thickness,				// Font Thickness In The Z Direction
						WGL_FONT_POLYGONS,				// Use Polygons, Not Lines
						&m_gmf[firstchar]);				// Address Of Buffer To Receive Data
	DWORD err=0;
	if (!ret)
	{
		err = GetLastError();	// grab the error code before any other call can overwrite it
		memset(&m_Unicode[firstchar], 0, numlists*sizeof(int));
		AppendFile("debug.txt","Error Generating Font : %0x\n",err);
	}
	else
	{
		for(int i=0; i<numlists; ++i)
			m_Unicode[firstchar+i] = liststart+i;
	}
	DeleteObject(font);
}

 
void CFont3dGLUnicode::CreateFontGreek(char* fontname, int siz)
{
	CreateFontInternal(fontname, siz, 900, 77, GREEK_CHARSET);
}
void CFont3dGLUnicode::CreateFontCyrilic(char* fontname, int siz)
{
	CreateFontInternal(fontname, siz, 1024, 95, EASTEUROPE_CHARSET);
}
void CFont3dGLUnicode::CreateFontHebrew(char* fontname, int siz)
{
	CreateFontInternal(fontname, siz, 1456, 68, HEBREW_CHARSET);
}
void CFont3dGLUnicode::CreateFontArabic(char* fontname, int siz)
{
	CreateFontInternal(fontname, siz, 1536, 255, ARABIC_CHARSET);
}
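
For completeness, this is roughly how the class uses the generated lists to draw a string (RenderText is not shown above, so take it as a sketch; wglUseFontOutlines builds each list so that it ends with a translation by the glyph’s advance, which is why consecutive glCallList calls lay the characters out automatically):

void CFont3dGLUnicode::RenderText(const wchar_t* text)
{
	glPushMatrix();	// the per-glyph translations accumulate, so restore the matrix afterwards
	for (const wchar_t* p = text; *p; ++p)
	{
		GLuint list = m_Unicode[*p];	// 0 means the code point was never mapped
		if (list)
			glCallList(list);			// draws the glyph and advances to the next cell
	}
	glPopMatrix();
}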

If your application is not compiled for UNICODE, you may be running into a problem converting the multibyte character values. If you need to use international character sets, you should use only wide-character data types and call “wglUseFontOutlinesW” directly. The plain “wglUseFontOutlines” name maps to either “wglUseFontOutlinesA” or “wglUseFontOutlinesW” depending on your UNICODE setting.
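For example, something like this always interprets the starting value as a Unicode code point, whatever the project’s character-set setting (the helper name is made up; the sizes mirror the Greek case above):

#include <windows.h>
#include <GL/gl.h>

// Build outline display lists for the Greek range via the wide variant,
// so 'first' is always treated as a Unicode code point.
bool BuildGreekOutlines(HDC hdc, GLuint& listBase, GLYPHMETRICSFLOAT* gmf /* >= 77 entries */)
{
	listBase = glGenLists(77);                      // one list per glyph
	return wglUseFontOutlinesW(hdc,
	                           900,                 // starting Unicode code point
	                           77,                  // number of glyphs / lists
	                           listBase,            // base display list
	                           0.002f,              // chordal deviation from the true outlines
	                           1.0f,                // extrusion in Z
	                           WGL_FONT_POLYGONS,   // filled polygons, not lines
	                           gmf) != FALSE;       // per-glyph metrics out
}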

Right, I was using wstring and some other CUnicodeString class… but yes, changing to wglUseFontOutlinesW has solved the problem!
I didn’t even suspect there was a wide-character variant, since none of the function’s arguments are wide-char or Unicode strings.

Thanks for your help!