The encoding of text by bits was used in Morse code (1844) and in early digital communications machines such as teletypes and stock ticker machines (1870). Ralph Hartley suggested the use of a logarithmic measure of information in 1928.
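For context, the logarithmic measure Hartley proposed is usually stated as follows; this is a standard rendering of his 1928 result, added here for illustration rather than quoted from the text above. For a message of n symbols, each drawn from an alphabet of S possible symbols, the amount of information is

    H = \log S^{n} = n \log S

so information grows linearly with message length, and taking the logarithm to base 2 gives the count in bits.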
Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punched cards used in the mechanical computers of that time.
Claude E. Shannon first used the word "bit" in his seminal 1948 paper "A Mathematical Theory of Communication"; he attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary information digit" to simply "bit".
The most common multiple-bit unit is the byte, coined by Werner Buchholz in June 1956. Historically, the byte was the group of bits used to encode a single character of text in a computer (until UTF-8 multibyte encoding took over), and for this reason it became the basic addressable element in many computer architectures.
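To make the parenthetical about UTF-8 concrete, the short sketch below (Python is an arbitrary choice here, and the sample characters are illustrative assumptions rather than anything from the text above) encodes a few characters and prints how many bytes each one occupies, showing that a character no longer always fits in a single byte:

    # UTF-8 is a variable-length encoding: a single character of text
    # may occupy anywhere from 1 to 4 bytes.
    for ch in ["A", "é", "€", "𝄞"]:  # ASCII, accented Latin, currency sign, musical symbol
        encoded = ch.encode("utf-8")
        print(f"{ch!r}: {len(encoded)} byte(s) -> {encoded.hex(' ')}")

Running this prints 1 byte for "A" (41), 2 for "é" (c3 a9), 3 for "€" (e2 82 ac), and 4 for "𝄞" (f0 9d 84 9e).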
The same principle was later used in the magnetic bubble memory developed in the 1980s, and is still found in various magnetic-stripe items such as metro tickets and some credit cards. In modern semiconductor memory, such as dynamic random-access memory, the two values of a bit may be represented by two levels of electric charge stored in a capacitor.