In algorithmic information theory, **algorithmic (Solomonoff) probability** is a method of assigning a probability to each hypothesis (algorithm/program) that explains a given observation, with the simplest hypothesis (the shortest program) having the highest probability and increasingly complex hypotheses (longer programs) receiving increasingly small probabilities. Together, these probabilities form an *a priori* probability distribution for the observation, which Ray Solomonoff proved to be machine-invariant (the invariance theorem) and which can be combined with Bayes' theorem to predict the most likely continuation of that observation. The computations are carried out on a theoretical computer, the universal Turing machine.

Solomonoff invented the concept of algorithmic probability, with its associated invariance theorem, around 1960. He first presented his results at a conference at Caltech in 1960 and in a February 1960 report, "A Preliminary Report on a General Theory of Inductive Inference." He developed these ideas more fully in 1964 with "A Formal Theory of Inductive Inference," Part I and Part II.

The algorithmic probability of any given finite output prefix *q* is the sum of the probabilities of the programs that compute something starting with *q*, where a program *p* of length |*p*| bits contributes 2^(−|*p*|). Even certain long outputs have high probability, provided they are generated by short programs.
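As a toy illustration (not Solomonoff's actual construction, which requires a universal prefix machine), one can fix a small reference machine, enumerate all bit-string programs up to a length cutoff, and sum 2^(−|*p*|) over those whose output begins with *q*. The run-length machine below is an assumption made purely for the sketch:

```python
from itertools import product

def toy_machine(program):
    """Toy (decidedly non-universal) reference machine: bits are read in
    (symbol, repeat) pairs; each pair emits its symbol 1 + repeat times."""
    out = []
    for i in range(0, len(program) - 1, 2):
        out.extend([program[i]] * (1 + program[i + 1]))
    return out

def approx_algorithmic_probability(prefix, max_len=12):
    """Approximate P(prefix) = sum of 2**-len(p) over programs p whose
    output begins with prefix, truncated at programs of max_len bits."""
    total = 0.0
    for n in range(1, max_len + 1):
        for bits in product([0, 1], repeat=n):
            if toy_machine(list(bits))[:len(prefix)] == prefix:
                total += 2.0 ** -n
    return total
```

On this machine the compressible string `0000` receives a higher value than `0110`, because its shortest generating program is shorter (4 bits versus 6). Note that the toy machine is not prefix-free, so this truncated sum is only a rough illustration of the real, lower-semicomputable quantity.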

Algorithmic probability is the main ingredient of Solomonoff's theory of inductive inference, the theory of prediction based on observations. It was invented with the goal of using it for machine learning: given a sequence of symbols, which one will come next? Solomonoff's theory provides an answer that is optimal in a certain sense, although it is incomputable. Unlike Karl Popper's informal theory of inductive inference, for example, Solomonoff's is mathematically rigorous.

Algorithmic probability is closely related to the concept of Kolmogorov complexity. Kolmogorov complexity, however, measures the information content of a string, while algorithmic probability measures how likely a string is to be produced by a random program, which is what makes it useful for prediction.
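The relationship can be made quantitative. Levin's coding theorem states that the universal a priori probability *M*(*x*) and the prefix Kolmogorov complexity *K*(*x*) agree up to an additive constant:

```latex
% Coding theorem (Levin): the negative log of the universal a priori
% probability M(x) equals the prefix Kolmogorov complexity K(x),
% up to an additive constant independent of x.
-\log_2 M(x) = K(x) + O(1)
```

In other words, a string is algorithmically probable exactly when it is simple: its probability is dominated by the contribution of its shortest program.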

Solomonoff's enumerable measure is universal in a certain powerful sense, but the computation time can be infinite. To deal with this, Solomonoff uses a variant of Leonid Levin's search algorithm, which limits the time spent computing the success of possible programs, with shorter programs given more time. Solomonoff also limits the search space by other means, such as training sequences.
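The time-allocation idea behind Levin search can be sketched as follows (this is a minimal illustration under assumed names, not Solomonoff's actual system): in phase *k*, every program of length *n* ≤ *k* is run for 2^(*k*−*n*) steps, so each extra bit of program length halves the time allowance. The step-limited run-length interpreter here is a hypothetical stand-in for a real universal machine:

```python
from itertools import product

def step_limited(program, steps):
    """Toy step-limited interpreter: decode bits in (symbol, repeat)
    pairs, one step per pair; return None if the budget is exceeded."""
    out, used = [], 0
    for i in range(0, len(program) - 1, 2):
        used += 1
        if used > steps:
            return None  # simulate a program that has not yet halted
        out.extend([program[i]] * (1 + program[i + 1]))
    return out

def levin_search(interpreter, is_solution, max_phase=16):
    """Sketch of Levin search: in phase k, each program p with |p| <= k
    runs for 2**(k - |p|) steps, so shorter programs get more time."""
    for k in range(1, max_phase + 1):
        for n in range(1, k + 1):
            budget = 2 ** (k - n)
            for bits in product([0, 1], repeat=n):
                out = interpreter(list(bits), budget)
                if out is not None and is_solution(out):
                    return list(bits), out
    return None
```

Because shorter programs are tried first with larger budgets in each phase, the first solution found is generated by a shortest sufficient program; for example, searching for an output beginning `0000` returns the 4-bit program before any 6-bit one.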
