A very long time ago, I wrote a program to take an anti-aliased font from RISC OS, and convert it into a particular bitmap format for a game I never got around to doing much with (other than plotting fonts). That game was called WarZone.
The most important thing was that the code took the anti-aliased bitmap, and converted it into a 16-greyscale format, which was pretty much what I needed for the XilVGA project.
I found the code, and had a look at it. It took the 95 main characters of the ASCII set, and created a 4-bpp bitmap from it, using the smallest required rectangle. This was perfect - although it didn't compact the 4-bpp pixels into two pixels per byte, which the XilVGA code was expecting.
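Compacting two 4-bpp pixels into each byte is a small packing loop; a minimal sketch in C might look like this (the nibble order, first pixel in the high nibble, is an assumption, not necessarily what the XilVGA code expects):

```c
#include <stdint.h>
#include <stddef.h>

/* Pack 4-bpp pixel values (one per byte, each 0-15) into two pixels
   per byte. Assumption: the first pixel of each pair goes into the
   high nibble. An odd trailing pixel leaves the low nibble zero. */
void pack_4bpp(const uint8_t *pixels, size_t count, uint8_t *out)
{
    for (size_t i = 0; i + 1 < count; i += 2)
        out[i / 2] = (uint8_t)((pixels[i] << 4) | (pixels[i + 1] & 0x0F));
    if (count & 1)
        out[count / 2] = (uint8_t)(pixels[count - 1] << 4);
}
```

With this ordering, pixel values 1, 2, 3, 4 pack into the bytes 0x12, 0x34.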
So, after spending some time trying to remember the keyboard shortcuts for !Zap (it's been such a long time), I made the necessary modifications and produced three font sizes.
The output files were converted to C, and then imported into the XilVGA PIC project.
After a few incorrect calculations, I ended up with something like this:
Unfortunately, I'd got the nibble ordering wrong in the initial calculation, and since I'd run out of power cables and the RiscPC's power supply was powering the XilVGA board, I decided to just swap the nibbles in software.
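Swapping the nibbles of every byte is a one-liner, which is why fixing it in software was the cheap option; a sketch of the sort of helper involved:

```c
#include <stdint.h>

/* Swap the two 4-bit nibbles of a byte - the quick software fix
   when the packed pixel order turns out to be reversed. */
static inline uint8_t swap_nibbles(uint8_t b)
{
    return (uint8_t)((b << 4) | (b >> 4));
}
```

Running this over the font data once (or on each byte as it's plotted) reverses the pixel order without touching the generator on the RiscPC.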
With the addition of a correct palette, it looked much better:
I then tried the other two font sizes:
I didn't really like the width of the space, but before changing that, I removed a lot of the test code, and got it to display a warning I never want to see:
And then I adjusted the space width (by directly editing the binary data).
Perfect. One issue is that the fonts only use a 2bpp palette, even though they're stored at 4bpp. This means I can regenerate the fonts at 2bpp, which will shrink the memory footprint (currently about 60% of the PIC's memory with all three fonts embedded).
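Since the 4bpp data only ever holds values 0-3, the repack to 2bpp halves the storage mechanically. A sketch, assuming two pixels per input byte with the first pixel in the high nibble (both assumptions about the real format):

```c
#include <stdint.h>
#include <stddef.h>

/* Repack font data from 4 bpp (two pixels/byte) to 2 bpp (four
   pixels/byte). Assumes every 4-bit pixel value is already 0-3,
   and that earlier pixels occupy the more significant bits.
   Returns the number of output bytes written. */
size_t repack_4bpp_to_2bpp(const uint8_t *in, size_t in_len, uint8_t *out)
{
    for (size_t i = 0; i < in_len; i++) {
        uint8_t hi = (in[i] >> 4) & 0x03;          /* first pixel  */
        uint8_t lo = in[i] & 0x03;                 /* second pixel */
        uint8_t pair = (uint8_t)((hi << 2) | lo);  /* 4 bits = 2 pixels */
        if (i & 1)
            out[i / 2] |= pair;                    /* low half of byte  */
        else
            out[i / 2] = (uint8_t)(pair << 4);     /* high half of byte */
    }
    return (in_len + 1) / 2;
}
```

Two input bytes collapse into one output byte, so three embedded fonts at 60% of the PIC's memory should drop to roughly 30%.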
© Copyright 1997-2018
Tribbeck.com / Jason Tribbeck
All trademarks are the property of their respective owners.