**Purpose of Cross Validation**

Suppose we have a model with one or more unknown parameters, and a data set to which the model can be fit (the training data set). The fitting process optimizes the model parameters to make the model fit the training data as well as possible. If we then take an independent sample of validation data from the same population as the training data, it will generally turn out that the model does not fit the validation data as well as it fits the training data. This is called overfitting, and is particularly likely to happen when the size of the training data set is small, or when the number of parameters in the model is large. Cross-validation is a way to predict the fit of a model to a hypothetical validation set when an explicit validation set is not available.
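The overfitting described above is easy to demonstrate numerically. The sketch below uses a hypothetical setup not taken from the text: a noisy quadratic truth fit with an over-flexible degree-15 polynomial on a small training sample, after which the training MSE typically falls well below the MSE on an independent validation sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data-generating process: a quadratic curve plus Gaussian noise.
def make_data(n):
    x = rng.uniform(-1, 1, n)
    y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.3, size=n)
    return x, y

x_tr, y_tr = make_data(20)    # small training set, so overfitting is likely
x_va, y_va = make_data(200)   # independent validation sample from the same population

# Fit an over-flexible degree-15 polynomial by least squares
# (numpy may warn that the fit is poorly conditioned; that is expected here).
coef = np.polyfit(x_tr, y_tr, deg=15)

mse_train = np.mean((np.polyval(coef, x_tr) - y_tr) ** 2)
mse_valid = np.mean((np.polyval(coef, x_va) - y_va) ** 2)
# The fitted curve tracks the training noise, so mse_train is typically
# far smaller than mse_valid.
```

The gap between the two MSE values is exactly the optimism that cross-validation is designed to estimate when no validation sample is available.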

Linear regression provides a simple illustration of overfitting. In linear regression we have real *response values* *Y*_{1}, ..., *Y*_{n}, and vector *covariates* *X*_{1}, ..., *X*_{p}. We can use least squares to fit a hyperplane *a* + *b*_{1}*X*_{1} + ... + *b*_{p}*X*_{p} between the *Y* and *X* data, and then assess the fit using the mean squared error (MSE)

MSE = (1/*n*) Σ_{i=1}^{n} (*Y*_{i} − *a* − *b*_{1}*X*_{1i} − ... − *b*_{p}*X*_{pi})²,

where *X*_{ji} is the value of variable *X*_{j} corresponding to the *i*th response value *Y*_{i}.

It can be shown under mild assumptions that the expected value of the MSE for the training set is (*n* − *p* − 1)/(*n* + *p* + 1) < 1 times the expected value of the MSE for the validation set (the expected value is taken over the distribution of training sets). Thus if we fit the model and compute the MSE on the training set, we will get an optimistically biased assessment of how well the model will fit an independent data set. This biased estimate is called the *in-sample* estimate of the fit, whereas the cross-validation estimate is an *out-of-sample* estimate.
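The bias factor can be checked by simulation. The sketch below makes assumptions not stated in the text: Gaussian noise, a validation set observed at the same covariate values as the training set (one simple setting in which the factor holds), and arbitrary choices of *n*, *p*, the noise level, and the coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, sigma = 30, 5, 1.0                                    # arbitrary choices
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # intercept + p covariates
beta = rng.normal(size=p + 1)                               # arbitrary true coefficients

reps = 2000
train_mse = np.empty(reps)
valid_mse = np.empty(reps)
for r in range(reps):
    y_train = X @ beta + sigma * rng.normal(size=n)   # training responses
    y_valid = X @ beta + sigma * rng.normal(size=n)   # fresh responses, same X
    b_hat = np.linalg.lstsq(X, y_train, rcond=None)[0]
    fitted = X @ b_hat
    train_mse[r] = np.mean((y_train - fitted) ** 2)
    valid_mse[r] = np.mean((y_valid - fitted) ** 2)

# Average training MSE over validation MSE, versus the theoretical factor.
ratio_emp = train_mse.mean() / valid_mse.mean()
ratio_theory = (n - p - 1) / (n + p + 1)
```

With these settings the empirical ratio should land close to (30 − 5 − 1)/(30 + 5 + 1) = 2/3, confirming the optimism of the in-sample estimate.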

Since in linear regression it is possible to compute the factor (*n* − *p* − 1)/(*n* + *p* + 1) by which the training MSE underestimates the validation MSE directly, cross-validation is not practically useful in that setting. However, in most other regression procedures (e.g. logistic regression), there is no simple formula for this adjustment. Cross-validation is a generally applicable way to predict the performance of a model on a validation set, using computation in place of mathematical analysis.
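That computation is simple to carry out. As an illustrative sketch (the function name, the fold count, and the simulated data are all assumptions, not from the text), *k*-fold cross-validation for a least-squares fit partitions the data, fits on *k* − 1 folds, and averages the MSE on each held-out fold:

```python
import numpy as np

def kfold_cv_mse(X, y, k=5, seed=0):
    """Estimate out-of-sample MSE of a least-squares fit by k-fold cross-validation."""
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)   # shuffle before splitting
    folds = np.array_split(idx, k)
    errors = []
    for i in range(k):
        held_out = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        b = np.linalg.lstsq(X[train], y[train], rcond=None)[0]   # fit on k-1 folds
        errors.append(np.mean((y[held_out] - X[held_out] @ b) ** 2))
    return float(np.mean(errors))   # average held-out MSE over the k folds

# Hypothetical data: 200 observations, 5 covariates plus intercept, noise sd = 1.
rng = np.random.default_rng(42)
n, p = 200, 5
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
y = X @ rng.normal(size=p + 1) + rng.normal(size=n)

b_full = np.linalg.lstsq(X, y, rcond=None)[0]
mse_in_sample = float(np.mean((y - X @ b_full) ** 2))   # optimistically biased
mse_cv = kfold_cv_mse(X, y)                             # out-of-sample estimate
```

The cross-validation estimate typically exceeds the in-sample MSE and sits near the true noise variance, which is what makes it a usable stand-in for an explicit validation set.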

