Entropy in Thermodynamics and Information Theory
There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually denoted by H, introduced by Ralph Hartley in 1928 and developed by Claude Shannon in the 1940s. Shannon, although not initially aware of this similarity, commented on it when he introduced information theory in his 1948 paper A Mathematical Theory of Communication.
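To make the parallel concrete: for a discrete probability distribution p_i over microstates, the Gibbs entropy is S = -k_B Σ p_i ln p_i, while the Shannon entropy is H = -Σ p_i log2 p_i, so the two expressions differ only by the constant factor k_B ln 2. The following minimal Python sketch (standard library only; the function names are illustrative, not taken from any source) computes both quantities for the same distribution:

```python
import math

# CODATA value of the Boltzmann constant, in J/K.
K_B = 1.380649e-23

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p):
    """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)), measured in J/K."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

# A four-state system with unequal probabilities.
p = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(p)  # 1.75 bits
S = gibbs_entropy(p)    # the same distribution, in thermodynamic units

# The two expressions differ only by the constant factor k_B * ln(2).
assert abs(S - K_B * math.log(2) * H) < 1e-30
print(f"H = {H} bits, S = {S} J/K")
```

The assertion makes the relationship explicit: S = k_B ln(2) · H, so thermodynamic entropy can be read as Shannon entropy rescaled into physical units.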
This article explores the links between the two concepts and the extent to which they can be regarded as connected, covering their theoretical relationship, negentropy, black holes, quantum theory, and the fluctuation theorem.