In algorithmic information theory (a subfield of computer science), the **Kolmogorov complexity** of an object, such as a piece of text, is a measure of the computational resources needed to specify the object. It is named after Andrey Kolmogorov, who first published on the subject in 1963.

Kolmogorov complexity is also known as "descriptive complexity" (not to be confused with descriptive complexity theory), **Kolmogorov–Chaitin complexity**, **algorithmic entropy**, or **program-size complexity**.

For example, consider the following two strings of length 64, each containing only lowercase letters and digits:

abababababababababababababababababababababababababababababababab

4c1j5b2p0cv4w1x8rx2y39umgw5q85s7traquuxdppa0q7nieieqe9noc4cvafzf

The first string has a short English-language description, namely "ab 32 times", which consists of **11** characters. The second one has no obvious simple description (using the same character set) other than writing down the string itself, which has **64** characters.

More formally, the complexity of a string is the length of the shortest possible description of the string in some fixed universal description language (the sensitivity of complexity relative to the choice of description language is discussed below). It can be shown that the Kolmogorov complexity of any string cannot be more than a few bytes larger than the length of the string itself. Strings whose Kolmogorov complexity is small relative to the string's size are not considered to be complex.
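The "shortest description" idea can be approximated in practice: Kolmogorov complexity itself is uncomputable, but any lossless compressor gives an upper bound on description length. As a rough sketch (an illustration only; zlib is one fixed compressor standing in for the fixed description language, not a universal one), compressing the two 64-character strings from the example shows the gap:

```python
import zlib

# Compressed length is a computable upper bound on description length,
# up to the fixed size of the decompressor (here, zlib itself).
regular = b"ab" * 32
random_looking = b"4c1j5b2p0cv4w1x8rx2y39umgw5q85s7traquuxdppa0q7nieieqe9noc4cvafzf"

for s in (regular, random_looking):
    print(len(s), "->", len(zlib.compress(s, 9)))
# The regular string shrinks well below 64 bytes; the random-looking one
# does not (zlib's stream overhead can even make it slightly longer).
```

This mirrors the point made below: a string whose complexity is small relative to its size is not considered complex, and compressibility is one computable witness of that.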

The notion of the Kolmogorov complexity can be used to state and prove impossibility results akin to Gödel's incompleteness theorem and Turing's halting problem.


### Other articles related to "kolmogorov complexity, kolmogorov, complexity":

**Kolmogorov Complexity**

... The field of **Kolmogorov complexity** and algorithmic randomness was developed during the 1960s and 1970s by Chaitin, **Kolmogorov**, Levin, Martin-Löf and ... The main idea is to consider a universal Turing machine U and to measure the **complexity** of a number (or string) x as the length of the shortest input p such that U(p) outputs x ... **Kolmogorov complexity** became not only a subject of independent study but is also applied to other subjects as a tool for obtaining proofs ...
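The "shortest input p such that U(p) outputs x" idea can be made concrete with a toy sketch. The description language below is an assumption for illustration only, a miniature stand-in for a universal machine: a program has the form "count|unit", and the machine outputs the unit repeated count times.

```python
# A toy description language (an assumption, not a true universal Turing
# machine): program "k|u" outputs the string u repeated k times. The
# complexity of x *in this language* is the length of the shortest such p.
def U(p: str) -> str:
    count, _, unit = p.partition("|")
    return unit * int(count)

def shortest_description(x: str) -> str:
    best = f"1|{x}"  # trivial program: emit x verbatim, so C(x) <= |x| + 2
    for k in range(2, len(x) + 1):
        if len(x) % k == 0 and x[: len(x) // k] * k == x:
            candidate = f"{k}|{x[: len(x) // k]}"
            if len(candidate) < len(best):
                best = candidate
    return best

p = shortest_description("ab" * 32)
print(p, len(p))  # "32|ab" 5 -- far shorter than the 64-character string
assert U(p) == "ab" * 32
```

A real universal machine admits far richer programs, but the structure is the same: complexity is defined by minimizing program length over everything the machine can run.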

...

... In its probability-model form, **Kolmogorov**'s structure function becomes h′_x(α) = min_P { −log P(x) : P(x) > 0, K(P) ≤ α }, where x is a binary string of length n, P is a contemplated model (a computable probability distribution on n-length strings) for x, and K(P) is the **complexity** of P ... h′_x(α) reaches 0 for α = K(x) + c, where c is the required number of bits to change x into ... and K(x) is the **Kolmogorov complexity** of x ... For every **complexity** level α, the function h′_x(α) is the **Kolmogorov complexity** version of the maximum likelihood (ML) estimator ...

**Kolmogorov Complexity** - Relation To Entropy

... It can be shown that for the output of Markov information sources, **Kolmogorov complexity** is related to the entropy of the information source ... More precisely, the **Kolmogorov complexity** of the output of a Markov information source, normalized by the length of the output, converges almost surely (as the length of the output goes to infinity) to the entropy of the source ...
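This convergence can be illustrated empirically with a sketch. The assumptions here are mine: an i.i.d. binary source (the simplest Markov source) and zlib as a computable stand-in for the uncomputable K. The compressed size per symbol of a long sample should land near, but not below, the source's entropy rate.

```python
import math
import random
import zlib

random.seed(0)
p = 0.1  # probability of emitting a 1
entropy_rate = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # ~0.469 bits

n = 200_000
sample = "".join("1" if random.random() < p else "0" for _ in range(n))

# Pack 8 symbols per byte so the compressor sees the raw bit stream,
# not one ASCII character per symbol.
packed = int(sample, 2).to_bytes((n + 7) // 8, "big")
rate = 8 * len(zlib.compress(packed, 9)) / n  # compressed bits per symbol

print(f"entropy rate {entropy_rate:.3f}, compressed rate {rate:.3f}")
# zlib cannot beat the entropy rate on average; it lands somewhat above it.
```

A general-purpose compressor never reaches the entropy rate exactly, but as n grows the gap per symbol shrinks, which is the shape of the almost-sure convergence stated above.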

... Leonid Levin and Claus-Peter Schnorr proved a characterization in terms of **Kolmogorov complexity**: a sequence is random if there is a uniform bound on the compressibility of its initial segments ... Li and Vitanyi's book *An Introduction to Kolmogorov Complexity and Its Applications* is the standard introduction to these ideas ...

**Kolmogorov complexity** (Schnorr 1973, Levin 1973)

**Kolmogorov complexity** can be thought of as a lower bound on the algorithmic compressibility of a finite ...

**Kolmogorov Complexity**

... The chain rule for **Kolmogorov complexity** is an analogue of the chain rule for information entropy, which states that H(X, Y) = H(X) + H(Y | X). That is, the combined randomness of two sequences X and Y is the sum of the randomness of X plus whatever randomness is left in Y once X is known; for entropy this follows from the definitions of conditional and joint probability ... The equivalent statement for **Kolmogorov complexity** does not hold exactly; it is true only up to a logarithmic term: K(X, Y) = K(X) + K(Y | X) + O(log K(X, Y)). (An exact version, KP(x, y) = KP(x) + KP(y | x*) + O(1), holds for the prefix complexity KP ...) ... an analogue of mutual information for **Kolmogorov complexity** is symmetric: I(x : y) = I(y : x) + O(log K(x, y)) for all x, y ...
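The chain rule can be sanity-checked with compression lengths as a crude stand-in for K (an illustration under that assumption only; zlib is not a universal machine). When y is simply a copy of x, K(y | x) is small, so the pair should cost barely more than x alone:

```python
import random
import zlib

def C(b: bytes) -> int:
    """Compressed length: a crude, computable stand-in for K."""
    return len(zlib.compress(b, 9))

random.seed(1)
x = bytes(random.randrange(256) for _ in range(4096))  # incompressible-ish

# The pair (x, x) costs barely more than x alone, mirroring
# K(x, y) ~ K(x) + K(y | x) with K(y | x) small when y is a copy of x.
print(C(x), C(x + x))
```

zlib sees the second copy of x as one long back-reference into its window, which is exactly the "describe y given x" saving the chain rule formalizes.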
