Entropy is a thermodynamic property that measures the portion of a system's thermal energy per unit temperature that is unavailable for doing useful work. Perhaps the most familiar manifestation of entropy is that, following the laws of thermodynamics, the entropy of an isolated system never decreases, and in heat transfer, heat energy flows from higher-temperature components to lower-temperature components. In thermally isolated systems, entropy production runs in one direction only; it is not a reversible process. Measuring the entropy of a system therefore indicates how much energy is unavailable for work in a thermodynamic process such as energy conversion in engines or other machines. Such processes and devices can only be driven by energy that is convertible to work, and they have a theoretical maximum efficiency when converting energy to work. During this work, entropy accumulates in the system and is then dissipated to the surroundings as waste heat.
In classical thermodynamics, the concept of entropy is defined phenomenologically by the second law of thermodynamics, which states that the entropy of an isolated system always increases or remains constant. Entropy is thus also a measure of the tendency of a process, such as a chemical reaction, to be entropically favored, that is, to proceed spontaneously in a particular direction. The second law dictates that thermal energy always flows spontaneously, in the form of heat, from regions of higher temperature to regions of lower temperature. Such processes reduce the state of order of the initial systems, so entropy is an expression of disorder or randomness. This is the basis of the modern microscopic interpretation of entropy in statistical mechanics, where entropy is defined as the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. The second law is then a consequence of this definition and the fundamental postulate of statistical mechanics.
Thermodynamic entropy has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J/K) in the International System of Units.
The term entropy was coined in 1865 by Rudolf Clausius based on the Greek εντροπία, a turning toward, from εν- (in) and τροπή (turn, conversion).
Other articles related to "entropy":
... In statistical mechanics, configuration entropy is the portion of a system's entropy that is related to the positions of its constituent particles rather than to their velocities or momenta ... Configurational entropy is also known as microscopic entropy or conformational entropy in the study of macromolecules ... In general, configurational entropy is the foundation of statistical thermodynamics ...
... Asimov about the ability of humankind to cope with and potentially reverse the process of entropy ... named AC (Multivacs) again and again: "How can the net amount of entropy of the universe be massively decreased?" Each time, AC's answer is "INSUFFICIENT DATA FOR A MEANINGFUL ANSWER." In the last scene ... discovers the answer and decides to show it by demonstrating the reversal of entropy: "And AC said, 'LET THERE BE LIGHT!' And there was light." ...
... Entropy is broadly a measure of the disorder of some system ... In statistical thermodynamics, the entropy S of some physical system is defined as S = -k_B Σ_i p_i ln p_i, where the sum is over all possible states i of the system in question, such as the positions of gas particles in a container, and p_i is the probability that the system is in state i ... Similarly, entropy in information theory measures the quantity of information ...
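A minimal Python sketch of this sum follows; the function name gibbs_entropy and the example probabilities are illustrative assumptions rather than anything taken from the excerpt above, and the Boltzmann constant uses its exact SI value.

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

    def gibbs_entropy(probabilities, k=K_B):
        # S = -k * sum(p_i * ln p_i); states with zero probability contribute nothing.
        return -k * sum(p * math.log(p) for p in probabilities if p > 0)

    # Two equally likely microstates give S = k_B * ln 2.
    print(gibbs_entropy([0.5, 0.5]))         # thermodynamic entropy in J/K
    print(gibbs_entropy([0.5, 0.5], k=1.0))  # dimensionless form, ln 2 ≈ 0.693

Setting k = 1 (or k = 1/ln 2) turns the same sum into the information-theoretic entropy mentioned at the end of the excerpt.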
... The configurational entropy is related to the number of possible configurations W by Boltzmann's entropy formula S = k_B ln W, where k_B is the Boltzmann constant and ... If the system can be in states n with probabilities P_n, the configurational entropy of the system is given by S = -k_B Σ_n P_n ln P_n, which in the perfect disorder limit (all P_n = 1/W) leads to Boltzmann's formula, while in ... analogous to that of Shannon's information entropy ...
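The perfect-disorder limit mentioned above is easy to check numerically. In this sketch the distributions and the function name configurational_entropy are made up for illustration only.

    import math

    K_B = 1.380649e-23  # Boltzmann constant in J/K

    def configurational_entropy(probs, k=K_B):
        # S = -k * sum(P_n * ln P_n) over the accessible configurations.
        return -k * sum(p * math.log(p) for p in probs if p > 0)

    W = 6                                    # number of possible configurations
    uniform = [1.0 / W] * W                  # perfect disorder: all P_n = 1/W
    biased = [0.5, 0.3, 0.1, 0.05, 0.03, 0.02]

    print(configurational_entropy(uniform))  # equals k_B * ln W ...
    print(K_B * math.log(W))                 # ... Boltzmann's formula gives the same value
    print(configurational_entropy(biased))   # smaller: the biased system is more ordered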
... The (empirical) entropy of a set of frequencies f_1, ..., f_k with total count n is defined as -Σ_i (f_i/n) log(f_i/n), that is, the Shannon entropy of the corresponding relative frequencies ...
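For the empirical case, the observed counts are first normalised by the total before applying the usual sum; the sketch below assumes that convention, and the name empirical_entropy and the sample counts are invented for illustration.

    import math

    def empirical_entropy(counts, base=2):
        # Shannon entropy of relative frequencies f_i / n, in bits by default.
        n = sum(counts)
        return -sum((c / n) * math.log(c / n, base) for c in counts if c > 0)

    print(empirical_entropy([25, 25, 25, 25]))  # 2.0 bits: uniform over four symbols
    print(empirical_entropy([70, 20, 9, 1]))    # well under 2 bits: skewed counts carry less information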
Famous quotes containing the word entropy:
“Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.”
—Václav Havel (b. 1936)