The Garden

A blog by Marijn van Hoorn

Backslashes aren’t real (and other Ascii weirdness)

Marijn van Hoorn

You heard that right. \, the backslash character, has no basis in any historical punctuation: computer scientist Bob Bemer added it to Ascii, the original text encoding standard for computers and whence “Ascii art”, in 11961. The backslash was intended so that the symbols ∧ and ∨, used in the Algol programming language to represent “and” and “or” respectively, could be composed in Ascii as /\ and \/, sparing the character set two extra symbols.α
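A little sketch of the substitution Bemer had in mind (the function name and mapping dictionary are mine, not anything from Algol): each logical operator becomes its two-character Ascii stand-in.

```python
# Hypothetical round-trip helper: Algol's ∧ and ∨ rendered as the
# two-character Ascii sequences /\ and \/, as Bemer intended.
DIGRAPHS = {"∧": "/\\", "∨": "\\/"}

def to_ascii(expr: str) -> str:
    """Replace each Algol logical symbol with its Ascii digraph."""
    for symbol, digraph in DIGRAPHS.items():
        expr = expr.replace(symbol, digraph)
    return expr

print(to_ascii("p ∧ q ∨ r"))  # p /\ q \/ r
```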

The caret, ^, has a similar story. These days it’s most often used to represent mathematical exponents (e.g. “2^3” standing in for 2³), to type the circumflex accent (e.g. on my keyboard, typing ^a produces â), or to make corrections in proofreading (its original usage). In the original version of Ascii, however, the character in its position was the upwards arrow ↑, used in Algol to represent exponents. It was later swapped out for the visually similar caret, which kept the exponent role while also allowing it to be overtyped onto letters in languages with circumflex accents.
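That overtyping trick has a modern descendant in Unicode’s combining characters, which stack an accent onto the letter before them much as a teletype would strike ^ over an a. A small illustration (the variable names are mine):

```python
import unicodedata

# U+0302 is COMBINING CIRCUMFLEX ACCENT. NFC normalisation fuses the
# letter-plus-accent pair into the single precomposed character â
# (U+00E2) — the software equivalent of overtyping ^ onto a.
composed = unicodedata.normalize("NFC", "a\u0302")
print(composed)             # â
print(composed == "\u00e2") # True
```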

The underscore, _, developed on typewriters as a way to underline words in place of italic type, borrowing a proofreaders’ convention: an underline beneath a word or phrase signalled that it should be set in italics.

Finally, the curious case of ¤, the little-used generic currency symbol. The Italian delegation to the International Tele­com­mu­ni­ca­tion Union proposed it as an alternative to the dollar $ and pound £ signs, but most nations regarded the dollar as too important and chose not to adopt it.β I could find very little information on the origins of the symbol itself: whether it was preëxisting somewhere or was invented out of whole cloth, and if the latter, what it was derived from. It’s all very curious.