
View Full Version : glCallLists and Unicode



PleasantStorm
02-18-2015, 12:16 PM
I recently recompiled some of my older pre-Unicode code and found that the string type had been upgraded to support Unicode, while the OpenGL code was still set up to handle ANSI characters.

Just to share a bit on this topic: if you are using display lists to build your font, then when you make your subsequent call to glCallLists, be sure to set the type parameter to GL_UNSIGNED_SHORT so that your characters display properly. I read another forum post that covered this topic, but that topic was closed (circa 2001). So this is just a heads up to those still working with the wgl font functions and glCallLists to display them.

void glCallLists(
    GLsizei n,
    GLenum type,        // set this parameter to GL_UNSIGNED_SHORT when you make your call
    const GLvoid *lists
);

I use it in Embarcadero Delphi (XE4), but this should apply to any compiler or scripting language that supports the OpenGL headers and has native support for Unicode.

Hope this post isn't too antiquated for all you opengl pros out there. ;-)

Have a great day
Brian Joseph Johns

Alfonse Reinheart
02-18-2015, 12:36 PM
It's not antiquated; it's dangerously inaccurate.

Unicode is not a 2-byte character encoding. It used to be, sometime in the paleolithic era, but it hasn't been in over a decade. UTF-16 is not 2 bytes per character either. (https://en.wikipedia.org/wiki/UTF-16) Unicode uses 21 bits for code points, so both UTF-16 and UTF-8 require multiple code units (16- and 8-bit values, respectively) to encode the entire breadth of Unicode.

Furthermore, one code point is not one visible character. Transforming a sequence of code points into a sequence of glyphs is not trivial. It is not a 1:1 correspondence, due to things like combining characters (https://en.wikipedia.org/wiki/Combining_character), complex formatting (https://en.wikipedia.org/wiki/Unicode#Ligatures), and other issues.

So if you think you can just shove a "Unicode string" at glCallLists and your code "supports Unicode", you are wrong.

dukey
02-24-2015, 04:37 AM
The Windows API predates modern Unicode, so 2-byte chars are what we are stuck with.

Alfonse Reinheart
02-24-2015, 06:30 AM
The Windows API predates modern Unicode, so 2-byte chars are what we are stuck with.

While Windows did start its Unicode support with UCS-2, it quickly switched to UTF-16. And that was well over a decade ago (Windows 2000). So again, if you're planning to shove a Windows-provided string at this thing and expect it to do something reasonable, that's not a valid expectation.

Supporting Unicode requires supporting Unicode. And that means everything that comes with it.

Shinta
02-25-2015, 03:07 AM
Supporting Unicode requires supporting Unicode. And that means everything that comes with it.
So, what do you suggest one should use these days to display Unicode in OpenGL? glCallLists is obviously not helpful anymore (a 2-million-character display list? and it's deprecated, too).

Use FreeType (or similar) to render textures of the text you need? Or a giant (sparse) texture with all possibly necessary characters to pick from while rendering? Or what?

Agent D
02-25-2015, 04:51 AM
First of all, you have to find out what you are actually trying to accomplish. I can't think of any application that will ever need all Unicode code points at the same time.

If you want to use display lists, load characters on the fly as you need them, or pre-load them if you know what you need.
Same goes for caching glyphs in a texture or caching entire text strings.

dukey
02-25-2015, 06:03 AM
So, what do you suggest one should use these days to display Unicode in OpenGL? glCallLists is obviously not helpful anymore (a 2-million-character display list? and it's deprecated, too).

Use FreeType (or similar) to render textures of the text you need? Or a giant (sparse) texture with all possibly necessary characters to pick from while rendering? Or what?

Just cache the values you use; no need to create 2 million textures.

GClements
02-25-2015, 11:54 AM
So, what do you suggest one should use these days to display Unicode in OpenGL? glCallLists is obviously not helpful anymore (a 2-million-character display list? and it's deprecated, too).

Use FreeType (or similar) to render textures of the text you need? Or a giant (sparse) texture with all possibly necessary characters to pick from while rendering? Or what?
In the general case, use Pango or native OS functions to render complete strings into a bitmap which is then uploaded to a texture.

The glCallLists() approach can only work with "linear" scripts, and only when using pre-composed glyphs. And even then, it can't handle kerning.

Alfonse Reinheart
02-25-2015, 01:15 PM
It should be noted that Pango is a text layout library, not a text rendering library. It tells you which glyphs you need and where they go; it doesn't put them there.

GClements
02-25-2015, 04:57 PM
It should be noted that Pango is a text layout library, not a text rendering library. It tells you which glyphs you need and where they go; it doesn't put them there.

Pango includes functions to render the layouts it creates, using Win32, FreeType, Xft, CoreText, or cairo.