**Entropy In Thermodynamics And Information Theory**

There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by *S*, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually denoted by *H*, developed by Claude Shannon and Ralph Hartley in the 1940s. Shannon, although not initially aware of this similarity, commented on it upon publicizing information theory in *A Mathematical Theory of Communication*.

This article explores the links between the two concepts and the extent to which they can be regarded as connected.
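The formal parallel is concrete: for the same probability distribution, the Gibbs entropy *S* = −k_B Σ pᵢ ln pᵢ differs from the Shannon entropy *H* = −Σ pᵢ log₂ pᵢ only by the constant factor k_B ln 2 (the change of logarithm base plus Boltzmann's constant). A minimal Python sketch illustrating this relationship:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)), measured in J/K."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Example distribution over three microstates
probs = [0.5, 0.25, 0.25]
H = shannon_entropy(probs)   # 1.5 bits
S = gibbs_entropy(probs)
# The two differ only by the factor k_B * ln(2):
# S == K_B * math.log(2) * H (up to floating-point rounding)
```

The `p > 0` guard follows the usual convention that 0 log 0 = 0, so states with zero probability contribute nothing to either entropy.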


### Topics of Recent Research: Is Information Quantized?

Dr Tim Palmer has signalled two unwritten assumptions about Shannon's definition of information that may make it inapplicable as such to quantum mechanics. The quantization observed in quantum mechanics could be bound to information quantization: one cannot observe less than one bit, and what is not observed is, by definition, "random".
