# Streaming Algorithm - Some Streaming Problems - Entropy

The (empirical) entropy of a set of frequencies $\mathbf{a}$ is defined as $H(\mathbf{a}) = -\sum_{i=1}^n \frac{a_i}{m}\log{\frac{a_i}{m}}$, where $m = \sum_{i=1}^n a_i$ is the length of the stream.
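As a concrete illustration of the definition, the following sketch computes the empirical entropy directly from a vector of frequencies (here using base-2 logarithms, so the result is in bits):

```python
import math

def empirical_entropy(freqs):
    """Empirical entropy H(a) = -sum (a_i/m) * log2(a_i/m), with m = sum a_i."""
    m = sum(freqs)
    # Terms with a_i = 0 are skipped, following the convention 0 * log 0 = 0.
    return -sum((a / m) * math.log2(a / m) for a in freqs if a > 0)

# A uniform distribution over 4 items has entropy log2(4) = 2 bits.
print(empirical_entropy([5, 5, 5, 5]))  # → 2.0
```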

Algorithms for estimating this quantity over a data stream have been given by:

• McGregor et al.
• Do Ba et al.
• Lall et al.
• Chakrabarti et al.
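To give a flavor of how such streaming estimators work, the sketch below applies the classic AMS sampling idea (a building block in several of the works above) to entropy. This is a minimal illustrative sketch, not the algorithm of any particular paper: each trial reservoir-samples a uniformly random stream position $J$ and counts $r$, the number of occurrences of the sampled item from position $J$ to the end; then $g(r) - g(r-1)$ with $g(r) = r\log(m/r)$ is an unbiased estimate of the entropy (in nats), and averaging over independent trials reduces the variance.

```python
import math
import random

def ams_entropy_estimate(stream, trials=400):
    """One-pass, AMS-style unbiased estimate of the empirical entropy (in nats).

    Each trial keeps one reservoir sample: a uniformly random position J and
    the count r of occurrences of stream[J] from J onward.  The quantity
    g(r) - g(r-1), with g(r) = r * log(m / r), has expectation H(a).
    """
    samples = [None] * trials  # sampled item for each trial
    counts = [0] * trials      # occurrences of that item since it was sampled
    m = 0
    for x in stream:
        m += 1
        for t in range(trials):
            if random.randrange(m) == 0:   # reservoir sampling: replace w.p. 1/m
                samples[t], counts[t] = x, 0
            if samples[t] == x:
                counts[t] += 1
    g = lambda r: r * math.log(m / r) if r > 0 else 0.0
    return sum(g(r) - g(r - 1) for r in counts) / trials
```

A single trial uses only constant space, so the total space is proportional to the number of trials rather than to the number of distinct items; the works cited above refine this basic scheme to obtain much better space/accuracy trade-offs.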

