Around 1960, Ray Solomonoff founded the theory of universal inductive inference, the theory of prediction based on observations; for example, predicting the next symbol based upon a given series of symbols. The only assumption is that the environment follows some unknown but computable probability distribution.
The theory has excellent theoretical properties and rests on solid philosophical foundations: it is a mathematically formalized Occam's razor. Shorter computable theories carry more weight when calculating the probability of the next observation, using all computable theories that perfectly describe the previous observations. Marcus Hutter's universal artificial intelligence builds upon this framework to calculate the expected value of an action.
The proof of the "razor" rests on known mathematical properties of a probability distribution over a denumerable set (these properties apply because the infinite set of all programs is denumerable): the sum S of the probabilities of all programs must be exactly equal to one (by the definition of a probability distribution), so the probabilities must become arbitrarily small as we enumerate the infinite set of all programs; if they were bounded away from zero, the series would diverge and S would be strictly greater than one.
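The weighting behind this argument can be illustrated with a toy sketch (this is an illustration of the convergence constraint, not Solomonoff's actual construction): assign each binary "program" the weight 2^(-length) under a prefix-free coding, so the weights of one program per length form a geometric series that sums below one and necessarily decreases with length.

```python
# Toy illustration (not Solomonoff's actual construction): under a
# prefix-free coding, a binary program of length L can carry weight
# 2^(-L); longer programs necessarily receive smaller weight, which is
# the formalized Occam's razor in miniature.

def prefix_free_weight(program: str) -> float:
    """Weight 2^(-len(p)) of a binary program under a prefix-free code."""
    return 2.0 ** -len(program)

# One program per length: weights 1/2, 1/4, 1/8, ... The terms must
# shrink toward zero, otherwise their infinite sum could not stay <= 1.
programs = ["1" * n for n in range(1, 30)]
weights = [prefix_free_weight(p) for p in programs]

assert all(w2 < w1 for w1, w2 in zip(weights, weights[1:]))
assert sum(weights) < 1.0
```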
Fundamental ingredients of the theory are the concepts of algorithmic probability and Kolmogorov complexity. The universal prior probability of any prefix p of a computable sequence x is the sum of the probabilities of all programs (for a universal computer) that compute something starting with p. Given some p and any computable but unknown probability distribution from which x is sampled, the universal prior and Bayes' theorem can be used to predict the yet unseen parts of x in an optimal fashion.
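The definition of the universal prior can be sketched on a deliberately simplified "machine" (an assumption for illustration only: each program is a bit string whose output is that string repeated forever, standing in for a real universal computer). The prior of a prefix x then sums 2^(-|p|) over all programs whose output starts with x, and prediction compares the prior of the two possible continuations:

```python
# Hedged toy sketch of the universal prior. Real Solomonoff induction
# runs all programs on a universal machine and is incomputable; here a
# "program" p is a binary string whose output is p repeated forever.

from itertools import product

def output(program: str, n: int) -> str:
    """First n bits produced by the toy machine: the program repeated."""
    return (program * (n // len(program) + 1))[:n]

def universal_prior(x: str, max_len: int = 12) -> float:
    """Sum of 2^(-|p|) over toy programs p whose output starts with x."""
    total = 0.0
    for length in range(1, max_len + 1):
        for bits in product("01", repeat=length):
            p = "".join(bits)
            if output(p, len(x)) == x:
                total += 2.0 ** -length
    return total

def predict_next(x: str) -> str:
    """Predict the continuation carrying the larger universal-prior weight."""
    return "0" if universal_prior(x + "0") >= universal_prior(x + "1") else "1"

# The shortest toy explanation of "010101" is the program "01", so the
# predictor favors continuing the alternating pattern.
assert predict_next("010101") == "0"
```

The short program "01" dominates the sum for the alternating sequence, which is exactly how shorter theories end up carrying more predictive weight.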
Though Solomonoff's inductive inference is not computable, several AIXI-derived algorithms approximate it so that it can run on a modern computer. The more computing power they are given, the closer their predictions come to those of inductive inference (their mathematical limit is Solomonoff's inductive inference).
Another direction of inductive inference is based on E. Mark Gold's model of learning in the limit from 1967, which has since given rise to more and more models of learning. The general scenario is the following: given a class S of computable functions, is there a learner (that is, a recursive functional) which, for any input of the form (f(0), f(1), ..., f(n)), outputs a hypothesis? A hypothesis is an index e with respect to a previously agreed-upon acceptable numbering of all computable functions; the indexed function should be consistent with the given values of f. A learner M learns a function f if almost all of its hypotheses are the same index e, which generates the function f; M learns S if M learns every f in S. Basic results are that all recursively enumerable classes of functions are learnable, while the class REC of all computable functions is not. Many related models have been considered, and the learning of classes of recursively enumerable sets from positive data is another topic studied from Gold's pioneering 1967 paper onwards. A far-reaching extension of Gold's approach is Schmidhuber's theory of generalized Kolmogorov complexities, which are kinds of super-recursive algorithms.
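Gold's scenario can be made concrete with identification by enumeration, a standard learning-in-the-limit strategy. The sketch below uses an illustrative (and deliberately finite) class S of small-coefficient polynomials with an agreed-upon numbering; the learner conjectures the least index consistent with the data seen so far, and for any f in S its conjectures eventually stabilize on a correct index:

```python
# Minimal sketch of Gold-style learning in the limit by enumeration.
# The class S and its numbering here are hypothetical, chosen finite so
# the example terminates: all polynomials a + b*x + c*x^2 with
# coefficients in 0..3, numbered in lexicographic order of (a, b, c).

from itertools import product

HYPOTHESES = [
    (lambda x, a=a, b=b, c=c: a + b * x + c * x * x)
    for a, b, c in product(range(4), repeat=3)
]

def learner(samples):
    """Least index e such that HYPOTHESES[e] agrees with all samples (n, f(n))."""
    for e, h in enumerate(HYPOTHESES):
        if all(h(n) == y for n, y in samples):
            return e
    return None  # the target lies outside the class S

def learn_in_the_limit(f, steps=8):
    """Feed f(0), f(1), ... to the learner; return its sequence of hypotheses."""
    samples, guesses = [], []
    for n in range(steps):
        samples.append((n, f(n)))
        guesses.append(learner(samples))
    return guesses

target = lambda x: 2 + 3 * x          # in S: coefficients (a, b, c) = (2, 3, 0)
guesses = learn_in_the_limit(target)

# Almost all hypotheses are the same index, and that index is correct:
# the learner has identified the target in the limit.
final = guesses[-1]
assert all(g == final for g in guesses[2:])
assert all(HYPOTHESES[final](n) == target(n) for n in range(10))
```

The early conjectures may be wrong (the learner first guesses simpler consistent polynomials), but once enough values of f rule them out, the hypothesis stabilizes, which is exactly the "almost all hypotheses are the same index" requirement.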