Entropy (information theory)

1878

The Gibbs entropy formula, S = -k_B Σ p_i ln p_i, is due to J. Willard Gibbs in 1878, after earlier work by Boltzmann (1872).
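
As a quick illustration of the Gibbs formula above, the following Python sketch evaluates S = -k_B Σ p_i ln p_i for an assumed four-state distribution; the distribution is chosen purely for illustration and does not come from the article.

    # Minimal sketch: Gibbs entropy S = -k_B * sum(p_i * ln p_i)
    # for an assumed example distribution of microstate probabilities.
    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K

    def gibbs_entropy(probabilities):
        # Zero-probability states contribute nothing (p ln p -> 0 as p -> 0).
        return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

    # Four equally likely microstates give S = k_B * ln 4.
    print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.91e-23 J/K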

1927

The Gibbs entropy translates over almost unchanged into the world of quantum physics to give the von Neumann entropy, introduced by John von Neumann in 1927: S = -k_B Tr(ρ ln ρ), where ρ is the density matrix of the quantum-mechanical system and Tr is the trace. At an everyday practical level, the links between information entropy and thermodynamic entropy are not evident.
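
To make S = -k_B Tr(ρ ln ρ) concrete, the sketch below evaluates it through the eigenvalues of a density matrix, which for a Hermitian ρ is equivalent to the trace expression; the maximally mixed qubit state used here is an illustrative assumption.

    # Minimal sketch: von Neumann entropy S = -k_B * Tr(rho ln rho),
    # evaluated via the eigenvalues of an assumed 2x2 density matrix.
    import numpy as np

    k_B = 1.380649e-23  # Boltzmann constant in J/K

    def von_neumann_entropy(rho):
        # For Hermitian rho, Tr(rho ln rho) equals the sum of lam * ln(lam)
        # over the eigenvalues lam of rho.
        lam = np.linalg.eigvalsh(rho)
        lam = lam[lam > 1e-12]  # drop numerically zero eigenvalues
        return -k_B * np.sum(lam * np.log(lam))

    # Maximally mixed qubit state rho = I/2 gives S = k_B * ln 2.
    rho = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
    print(von_neumann_entropy(rho))  # ~9.57e-24 J/K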

1948

The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is sometimes called Shannon entropy in his honour.

1961

Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules. But, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes first to acquire and store; the total thermodynamic entropy therefore does not decrease, which resolves the paradox.
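
As a rough worked example of this trade-off, Landauer's principle puts the minimum heat dissipated when one bit of Shannon information is erased at k_B T ln 2; the sketch below evaluates that bound at an assumed temperature of 300 K, chosen only for illustration.

    # Minimal sketch: Landauer bound k_B * T * ln 2 per bit of erased
    # information, evaluated at an assumed temperature of 300 K.
    import math

    k_B = 1.380649e-23  # Boltzmann constant in J/K
    T = 300.0           # assumed temperature in kelvin

    def landauer_limit(bits, temperature=T):
        # Minimum heat (in joules) dissipated when `bits` bits are erased.
        return bits * k_B * temperature * math.log(2)

    print(landauer_limit(1))  # ~2.87e-21 J per bit at 300 K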

1986

The authors of the 2011 Science study estimate humankind's technological capacity to store information (fully entropically compressed) in 1986 and again in 2007.

2007

(See also Kolmogorov complexity.) In practice, compression algorithms deliberately include some judicious redundancy in the form of checksums to protect against errors. A 2011 study in Science estimates the world's technological capacity to store and communicate optimally compressed information, normalized to the most effective compression algorithms available in the year 2007, thereby estimating the entropy of the technologically available sources.
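
To make the point about judicious redundancy concrete, the sketch below compresses a message with Python's zlib and prepends a CRC-32 checksum so that corruption can be detected before decompression; the message and the simple "checksum + payload" framing are assumptions for illustration, not the scheme used by any particular system in the study.

    # Minimal sketch: compression plus a checksum as deliberate redundancy.
    import zlib

    message = b"entropy bounds the size of any lossless encoding " * 10

    compressed = zlib.compress(message, 9)  # remove statistical redundancy
    frame = zlib.crc32(compressed).to_bytes(4, "big") + compressed  # add 4 redundant bytes

    # On receipt: verify the checksum before trusting the payload.
    checksum, payload = int.from_bytes(frame[:4], "big"), frame[4:]
    assert checksum == zlib.crc32(payload), "corruption detected"
    assert zlib.decompress(payload) == message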

The authors of the 2011 Science study estimate humankind's technological capacity to store information (fully entropically compressed) in 1986 and again in 2007.

2011

(See also Kolmogorov complexity.) In practice, compression algorithms deliberately include some judicious redundancy in the form of checksums to protect against errors. A 2011 study in Science estimates the world's technological capacity to store and communicate optimally compressed information, normalized to the most effective compression algorithms available in the year 2007, thereby estimating the entropy of the technologically available sources.




All text is taken from Wikipedia. Text is available under the Creative Commons Attribution-ShareAlike License.

Page generated on 2021-08-05