Every letter is a number



In computers, letters are represented as numbers: every letter is assigned its own decimal number between 0 and 127. The capital "H", for example, is encoded as 72. The string "Hello world" is therefore stored as a sequence of numbers.

 H  e   l   l   o  space  w   o   r   l   d
72 101 108 108 111   32  119 111 114 108 100
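The document does not name a programming language, but the mapping above can be reproduced in, for example, Python, whose built-in ord() and chr() functions convert between a character and its code:

```python
text = "Hello world"

# Character -> code: ord() returns the numeric code of a character.
codes = [ord(c) for c in text]
print(codes)  # [72, 101, 108, 108, 111, 32, 119, 111, 114, 108, 100]

# Code -> character: chr() is the inverse of ord().
print("".join(chr(n) for n in codes))  # Hello world
```

Note that ord("H") gives exactly the 72 from the table, and the space character maps to 32.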

As everything in a computer is stored in binary, those numbers are in turn translated into binary code. Not only are the capital letters encoded in this way, but also the lowercase letters, the digits 0-9, and punctuation symbols.
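To make the binary step concrete, here is a small Python sketch (Python is an illustrative choice, not named in the text) that prints each character's code alongside its 8-bit binary form:

```python
text = "Hi"
for c in text:
    # format(n, "08b") renders the number as 8 binary digits, zero-padded.
    print(c, ord(c), format(ord(c), "08b"))
# H 72 01001000
# i 105 01101001
```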

This very common character encoding is called ASCII (American Standard Code for Information Interchange). Today, the international standard for encoding is Unicode. It is far larger than ASCII - large enough that every character of every language can have its own code.
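The same Python functions work for Unicode, which illustrates both points: the first 128 Unicode codes coincide with ASCII, and characters outside ASCII simply get larger numbers. (Python is again just an illustrative choice.)

```python
# "H" lies in the ASCII range; "é" and "€" get codes beyond 127.
for c in "Hé€":
    print(c, ord(c))
# H 72
# é 233
# € 8364
```

Here 72 matches the ASCII table exactly, while the euro sign's code, 8364, could never fit in ASCII's 0-127 range.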