Most text characters are converted into binary using a code called ASCII (American Standard Code for Information Interchange). The first version of ASCII was published in 1963 and last updated in 1986.
ASCII contains 128 characters, which is enough for most English content. For example, the dollar sign ($) is represented by the binary code 0100100 (decimal 36); as you can see, each character fits in seven bits of data.
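A small sketch of this mapping in Python: `ord()` gives a character's code, and formatting it as a seven-bit binary string shows the bits described above.

```python
# Each ASCII character maps to a number from 0 to 127,
# which fits in seven bits of data.
for ch in "$A":
    code = ord(ch)               # character -> numeric code
    bits = format(code, "07b")   # seven-bit binary string
    print(f"{ch!r} -> decimal {code}, binary {bits}")
```

Running this prints `'$' -> decimal 36, binary 0100100`, matching the example in the text.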
ASCII, however, is not enough for everyone, since many languages use other writing systems, such as the Cyrillic alphabet used for Russian or Chinese characters. To handle them, a new standard called Unicode was created. It assigns a unique number, called a code point, to every character in every language. In the widely used UTF-8 encoding, each Unicode character takes between one and four bytes.