History
The earliest form of regression was the method of least squares, which was published by Legendre in 1805 and by Gauss in 1809. Legendre and Gauss both applied the method to the problem of determining, from astronomical observations, the orbits of bodies about the Sun (mostly comets, and later the newly discovered minor planets). Gauss published a further development of the theory of least squares in 1821, including a version of the Gauss–Markov theorem.
The term "regression" was coined by Francis Galton in the nineteenth century to describe a biological phenomenon. The phenomenon was that the heights of descendants of tall ancestors tend to regress down towards a normal average (a phenomenon also known as regression toward the mean). For Galton, regression had only this biological meaning, but his work was later extended by Udny Yule and Karl Pearson to a more general statistical context. In the work of Yule and Pearson, the joint distribution of the response and explanatory variables is assumed to be Gaussian. This assumption was weakened by R.A. Fisher in his works of 1922 and 1925. Fisher assumed that the conditional distribution of the response variable is Gaussian, but the joint distribution need not be. In this respect, Fisher's assumption is closer to Gauss's formulation of 1821.
In the 1950s and 1960s, economists used electromechanical desk calculators to calculate regressions. Before 1970, it sometimes took up to 24 hours to receive the result from one regression.
Regression methods continue to be an area of active research. In recent decades, new methods have been developed for robust regression, regression involving correlated responses such as time series and growth curves, regression in which the predictor or response variables are curves, images, graphs, or other complex data objects, regression methods accommodating various types of missing data, nonparametric regression, Bayesian methods for regression, regression in which the predictor variables are measured with error, regression with more predictor variables than observations, and causal inference with regression.