GLYPHMETRICSFLOAT

The MSDN docs say:
“The values of GLYPHMETRICSFLOAT are specified as notional units.”

Well then, how exactly can I convert notional units into something usable?
Something I can use in OpenGL?

Even more confusing is this, from the MSDN docs on wglUseFontOutlines:

“The em square size of the font, the notional grid size of the original font outline from which
the font is fitted, is mapped to 1.0 in the x- and y-coordinates in the display lists.”

Try to find an example that uses them. You can always experiment with the sizes you pass to them, to see what you like.

That won’t work, because I need an exact font size on screen. What I’m looking for
is an equation to convert notional units to font point size, like 12 or something.

All the MSDN is saying is that those glyph coordinates are normalized. Scaling, rotation, and translation in 3D are completely arbitrary. Just transform the modelview to taste…
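
For example, here’s a rough sketch of the usual pattern (the HDC, the 96-glyph range, and the scale factor are just placeholders I picked, and it assumes <windows.h>, <GL/gl.h> and <string.h> are included):

    GLYPHMETRICSFLOAT agmf[96];
    GLuint base = glGenLists(96);

    /* hdc must already have the desired TrueType font selected */
    wglUseFontOutlines(hdc, 32, 96, base,
                       0.0f,               /* chordal deviation */
                       0.0f,               /* extrusion: flat glyphs */
                       WGL_FONT_POLYGONS, agmf);

    /* One em square maps to 1.0, so scaling by S draws text S units tall */
    glPushMatrix();
    glScalef(5.0f, 5.0f, 1.0f);
    glListBase(base - 32);
    glCallLists((GLsizei)strlen("Hello"), GL_UNSIGNED_BYTE, "Hello");
    glPopMatrix();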

Device vs. Design Units
An application can retrieve font metrics for a physical font only after the font has
been selected into a device context. When a font is selected into a device context,
it is scaled for the device. The font metrics specific to the device are known as device units.

Portable metrics in fonts are known as design units. To apply to a specified device,
design units must be converted to device units. Use the following formula to convert
design units to device units.

DeviceUnits = (DesignUnits/unitsPerEm) * (PointSize/72) * DeviceResolution

The variables in this formula have the following meanings.

DeviceUnits: Specifies the DesignUnits font metric converted to device units. This value
is in the same units as the value specified for DeviceResolution.

DesignUnits: Specifies the font metric to be converted to device units. This value can be
any font metric, including the width of a character or the ascender value for an entire font.

unitsPerEm: Specifies the em square size for the font.

PointSize: Specifies the size of the font, in points. (One point equals 1/72 of an inch.)

DeviceResolution: Specifies the number of device units (pixels) per inch. Typical values
might be 300 for a laser printer or 96 for a VGA screen.

This formula should not be used to convert device units back to design units. Device units
are always rounded to the nearest pixel. The propagated round-off error can become very large,
especially when an application is working with screen sizes.
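
In C that formula is just this (the names are mine, not MSDN’s):

    /* designUnits and unitsPerEm are in font design units, pointSize is in
       points (1/72 inch), deviceResolution is in pixels per inch
       (96 for a typical screen, 300 for a laser printer). */
    float DesignToDevice(float designUnits, float unitsPerEm,
                         float pointSize, float deviceResolution)
    {
        return (designUnits / unitsPerEm) * (pointSize / 72.0f) * deviceResolution;
    }

    /* e.g. an advance of 1024 design units in a 2048-unit em, at 12 pt on a
       96 dpi screen: (1024/2048) * (12/72) * 96 = 8 pixels */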

Ok, assuming unitsPerEm is 1.0f and DesignUnits is 1.0f:
DeviceUnits = (DesignUnits/unitsPerEm) * (PointSize/72) * DeviceResolution
DeviceUnits = (1/1) * (12/72) * 72;
DeviceUnits = 12;

Which still doesn’t make sense. Scaling my fonts by 12 does not give me a font size of 12. :confused:

The height of the font in CreateFont() is ignored by wglUseFontOutlines. The scaling is completely arbitrary. That’s why the glyph metrics returned are normalized, so you can conveniently scale the result any way you want.

You could easily work out a conversion from projected pixels to point size if you wanted to, but I don’t know what you would use it for…
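
If you really wanted it, the conversion is something like this, assuming an ortho projection where one unit equals one pixel (the 96 dpi is just an example figure):

    /* One em spans (pointSize / 72) * dpi pixels, and wglUseFontOutlines maps
       one em to 1.0, so this is the glScalef factor for a given point size. */
    float EmScaleForPointSize(float pointSize, float dpi)
    {
        return (pointSize / 72.0f) * dpi;
    }

    float s = EmScaleForPointSize(12.0f, 96.0f);   /* 16 pixels per em */
    glScalef(s, s, 1.0f);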

If I wanted floating-point accuracy, is it possible to get the height of an on-screen
font as a floating-point value, rather than in whole pixels? I see there’s a Win32 function
called GetCharABCWidthsFloat(), but it has nothing to do with height.
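
In other words, the best I can do looks something like this, where the widths come back fractional but the height is still whole pixels (this assumes the font is already selected into hdc):

    /* Fractional advance width of one character via GetCharABCWidthsFloat */
    float GetCharWidthF(HDC hdc, unsigned char c)
    {
        ABCFLOAT abc;
        GetCharABCWidthsFloatA(hdc, c, c, &abc);
        return abc.abcfA + abc.abcfB + abc.abcfC;
    }

    /* Height comes from GetTextMetrics, which only gives whole pixels */
    int GetFontHeightPx(HDC hdc)
    {
        TEXTMETRIC tm;
        GetTextMetrics(hdc, &tm);
        return tm.tmHeight;     /* ascent + descent */
    }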

:rolleyes:

The height of the font in CreateFont() is ignored by wglUseFontOutlines.
I need to get the height of a font (TrueType or whatever) with floating-point precision.
This has nothing to do with wglUseFontOutlines().

Then what does this have to do with opengl?

No, I don’t mean the floating-point values from GLYPHMETRICSFLOAT. Those values were not derived from
precise floating-point measurements; they were all integers divided by a huge em size of 2048.

A simple thank you will do, tibit. You don’t have to go on so…

What? I haven’t solved my problem.

Ah, never mind. I can live without subpixel accuracy.