Font Definition Scripts

Ogre uses texture-based fonts to render the Ogre::TextAreaOverlayElement. You can also use the Ogre::Font object for your own purposes if you wish. The final form of a font is an Ogre::Material object generated by the font, and a set of 'glyph' (character) texture coordinates.

There are 2 ways you can get a font into OGRE:

  1. Design a font texture yourself using an art package or font generator tool
  2. Ask OGRE to generate a font texture based on a truetype font

The former gives you the most flexibility and the best performance (in terms of startup times), but the latter is convenient if you want to quickly use a font without having to generate the texture yourself. I suggest prototyping using the latter and changing to the former for your final solution.

All font definitions are held in .fontdef files, which are parsed by the system at startup time. Each .fontdef file can contain multiple font definitions. The basic format of an entry in the .fontdef file is:

font <font_name>
{
    type <image | truetype>
    source <image file | truetype font file>
    ...
    ... custom attributes depending on type
}

Using an existing font texture

If you have one or more artists working with you, no doubt they can produce you a very nice font texture. OGRE supports full colour font textures, or alternatively you can keep them monochrome / greyscale and use TextArea’s colouring feature. Font textures should always have an alpha channel, preferably an 8-bit alpha channel such as that supported by TGA and PNG files, because it can result in much nicer edges. To use an existing texture, here are the settings you need:

Parameters

type image
    This just tells OGRE you want a pre-drawn font.

source <filename>
    This is the name of the image file you want to load. This will be loaded from the standard resource locations and can be of any type OGRE supports, although JPEG is not recommended because of its lack of alpha and its lossy compression. I recommend the PNG format, which has both good lossless compression and an 8-bit alpha channel.

glyph <character> <u1> <v1> <u2> <v2>
    This provides the texture coordinates for the specified character. You must repeat this for every character you have in the texture. The first two numbers are the u and v of the top-left corner, the second two are the u and v of the bottom-right corner. Note that you really should use a common height for all characters, but widths can vary because of proportional fonts. 'character' is either an ASCII character for non-extended 7-bit ASCII, or, for extended glyphs, a unicode decimal value, which is identified by preceding the number with a 'u' - e.g. 'u0546' denotes unicode value 546.
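
Putting these attributes together, here is a minimal sketch of an image-based font definition. The font name, texture filename, and glyph coordinates are hypothetical and only illustrate the syntax; the coordinates assume a texture laid out as a 16 x 8 grid of equally sized cells:

font MyImageFont
{
    type image
    source my_font_texture.png

    glyph A 0.0 0.0 0.0625 0.125
    glyph B 0.0625 0.0 0.125 0.125
    glyph u0546 0.125 0.0 0.1875 0.125
}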

A note for Windows users: I recommend BitmapFontBuilder, a free tool which will generate a texture and export character widths for you. A tool for converting its binary output into 'glyph' lines can be found in the Tools folder.

Generating a font texture

You can also generate font textures on the fly using truetype fonts. I don't recommend heavy use of this in production work because rendering the texture can take several seconds per font, which adds to the loading times. However, it is a very nice way of quickly getting text output in a font of your choice.

Here are the attributes you need to supply:

Parameters

type truetype
    Tells OGRE to generate the texture from a font.

source <ttf file>
    The name of the .ttf file to load. This will be searched for in the common resource locations.

size <size_in_points>
    The size at which to generate the font, in points. This is the value that you would select in e.g. Word. It only affects how big the characters are in the font texture, not how big they are on the screen. You should tailor this to how large you expect to render the fonts, because generating a large texture will result in blurry characters when they are scaled very small, and conversely generating a small font will result in blocky characters if large text is rendered.

resolution <dpi>
    The resolution in dots per inch, which is used in conjunction with the point size to determine the final texture size. Typical values are 72 or 96 dpi. This should match the dpi of the screen, given that the glyphs occupy 'size' points after projection (i.e. in screen space).

antialias_colour <true|false>
    This is an optional flag, which defaults to false. The generator will antialias the font by default using the alpha component of the texture, which will look fine if you use alpha blending to render your text (this is the default assumed by TextAreaOverlayElement, for example). If, however, you wish to use a colour-based blend like add or modulate in your own code, you should set this to true so the colour values are antialiased too. If you set this to true and use alpha blending, you'll find the edges of your font are antialiased twice, resulting in a thinner look than usual, because not only is the alpha blending the edges, the colour is fading too. Leave this option at the default if in doubt.

code_points nn-nn [nn-nn] ..
    This directive allows you to specify which unicode code points should be generated as glyphs in the font texture. If you don't specify this, code points 33-126 will be generated by default, which covers the printable ASCII glyphs. If you use this directive, you should specify a space-separated list of inclusive code point ranges of the form 'start-end'. Numbers must be decimal.
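
For reference, a complete truetype entry might look like the following; the font name and .ttf filename are placeholders, and the code_points line requests printable ASCII plus the Latin-1 supplement:

font MyTrueTypeFont
{
    type truetype
    source myfont.ttf
    size 16
    resolution 96
    code_points 33-126 161-255
}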

You can also create new fonts at runtime by using the FontManager if you wish.
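
As a rough illustration of that runtime route, the C++ sketch below creates a truetype font through Ogre::FontManager. The font name and source file are placeholders, and it assumes your resource groups have already been initialised so the .ttf file can be found:

#include <OgreFont.h>
#include <OgreFontManager.h>
#include <OgreResourceGroupManager.h>

// Create an (as yet unloaded) font resource; "MyRuntimeFont" is a placeholder name.
Ogre::FontPtr font = Ogre::FontManager::getSingleton().create(
    "MyRuntimeFont", Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);

font->setType(Ogre::FT_TRUETYPE);    // equivalent of 'type truetype'
font->setSource("myfont.ttf");       // equivalent of 'source' (placeholder file)
font->setTrueTypeSize(16);           // equivalent of 'size'
font->setTrueTypeResolution(96);     // equivalent of 'resolution'
font->load();                        // generates the texture and material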