**Computational Complexity**

The following property holds: after *h*_{2}-sorting of any *h*_{1}-sorted array, the array remains *h*_{1}-sorted. Every *h*_{1}-sorted and *h*_{2}-sorted array is also (*a*_{1}*h*_{1}+*a*_{2}*h*_{2})-sorted, for any nonnegative integers *a*_{1} and *a*_{2}. The worst-case complexity of Shellsort is therefore connected with the Frobenius problem: for given integers *h*_{1},..., *h*_{n} with gcd = 1, the Frobenius number *g*(*h*_{1},..., *h*_{n}) is the greatest integer that cannot be represented as *a*_{1}*h*_{1}+ ... +*a*_{n}*h*_{n} with nonnegative integer *a*_{1},..., *a*_{n}. Using known formulae for Frobenius numbers, we can determine the worst-case complexity of Shellsort for several classes of gap sequences. Proven results are shown in the above table.
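The preservation and additivity properties above are easy to check directly. Below is a minimal Python sketch (the helpers `h_sort` and `is_h_sorted` are illustrative names, not from the source): after 5-sorting and then 3-sorting, the array is still 5-sorted, and hence also (1·5 + 1·3)-sorted.

```python
import random

def h_sort(a, h):
    """One Shellsort pass: insertion-sort each chain of elements h apart."""
    for i in range(h, len(a)):
        v, j = a[i], i
        while j >= h and a[j - h] > v:
            a[j] = a[j - h]
            j -= h
        a[j] = v

def is_h_sorted(a, h):
    """True if a[i] <= a[i + h] for every valid i."""
    return all(a[i] <= a[i + h] for i in range(len(a) - h))

random.seed(1)
a = [random.randrange(100) for _ in range(50)]
h_sort(a, 5)               # a is now 5-sorted
h_sort(a, 3)               # 3-sorting it ...
assert is_h_sorted(a, 3)
assert is_h_sorted(a, 5)   # ... leaves it 5-sorted
assert is_h_sorted(a, 8)   # and therefore (1*5 + 1*3)-sorted as well
```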

With respect to the average number of operations, none of the proven results concerns a practical gap sequence. For gaps that are powers of two, Espelid computed this average as 0.5349*N*^{3/2} − 0.4387*N* − 0.097*N*^{1/2} + *O*(1). Knuth determined the average complexity of sorting an *N*-element array with two gaps (*h*, 1) to be 2*N*^{2}/*h* + √(π*N*^{3}*h*). It follows that a two-pass Shellsort with *h* = Θ(*N*^{1/3}) makes on average *O*(*N*^{5/3}) comparisons. Yao found the average complexity of a three-pass Shellsort; his result was refined by Janson and Knuth, who derived the average number of comparisons made in each pass of a Shellsort with three gaps (*ch*, *cg*, 1), where *h* and *g* are coprime. The function *ψ*(*h*, *g*) appearing in their third-pass formula is complicated, but asymptotically equal to √(π*h*/128)*g*. In particular, when *h* = Θ(*N*^{7/15}) and *g* = Θ(*h*^{1/5}), the average time of sorting is *O*(*N*^{23/15}).
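The two-pass case can be explored empirically. The Python sketch below (the helper `h_sort_count` is an illustrative name, not from the source) counts comparisons for gaps (*h*, 1) with *h* ≈ *N*^{1/3} on a random array; the count lands far below the roughly *N*²/4 comparisons that a single insertion-sort pass would average.

```python
import random

def h_sort_count(a, h):
    """Gap-h insertion pass over a; returns the number of key comparisons."""
    comps = 0
    for i in range(h, len(a)):
        v, j = a[i], i
        while j >= h:
            comps += 1              # compare v with a[j - h]
            if a[j - h] > v:
                a[j] = a[j - h]
                j -= h
            else:
                break
        a[j] = v
    return comps

random.seed(7)
N = 10_000
a = [random.random() for _ in range(N)]
h = round(N ** (1 / 3))             # h = Theta(N^(1/3)), per Knuth's analysis
comps = h_sort_count(a, h) + h_sort_count(a, 1)
assert a == sorted(a)               # the two passes fully sort the array
assert comps < N * N // 10          # far below ~N^2/4 for one-pass insertion sort
```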

Based on experiments, it is conjectured that Shellsort with Hibbard's and Knuth's gap sequences runs in *O*(*N*^{5/4}) average time, and that Gonnet and Baeza-Yates's sequence requires on average 0.41 *N* ln *N* (ln ln *N* + 1/6) element moves. Approximations of the average number of operations formerly put forward for other sequences fail when sorted arrays contain millions of elements.
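For a sense of scale, the conjectured move count for Gonnet and Baeza-Yates's sequence can be evaluated directly (a small sketch; the function name is hypothetical):

```python
import math

def gbys_avg_moves(n):
    """Conjectured average element moves: 0.41 * N * ln N * (ln ln N + 1/6)."""
    return 0.41 * n * math.log(n) * (math.log(math.log(n)) + 1 / 6)

# Estimate for one million elements: on the order of 1.6e7 moves.
est = gbys_avg_moves(10**6)
assert 1e7 < est < 2e8
```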

The graph below shows the average number of element comparisons in various variants of Shellsort, divided by the theoretical lower bound, i.e. log_{2}*N*!, where the sequence 1, 4, 10, 23, 57, 132, 301, 701 has been extended according to the formula *h*_{*k*+1} = ⌊2.25 *h*_{*k*}⌋.
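Extending Ciura's sequence by repeatedly multiplying the last gap by 2.25 and taking the floor can be sketched as follows (an illustrative Python helper; the name and cutoff convention are assumptions):

```python
import math

def ciura_gaps(n):
    """Ciura's gaps 1, 4, ..., 701, extended by h_{k+1} = floor(2.25 * h_k),
    truncated to gaps smaller than the array size n."""
    gaps = [1, 4, 10, 23, 57, 132, 301, 701]
    while gaps[-1] < n:
        gaps.append(math.floor(2.25 * gaps[-1]))
    return [g for g in gaps if g < n]

# For a 5000-element array the extension adds 1577 and 3548.
assert ciura_gaps(5000) == [1, 4, 10, 23, 57, 132, 301, 701, 1577, 3548]
```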

Applying the theory of Kolmogorov complexity, Jiang, Li, and Vitányi proved the following lower bounds for the order of the average number of operations in an *m*-pass Shellsort: Ω(*mN*^{1+1/*m*}) when *m* ≤ log_{2}*N*, and Ω(*mN*) when *m* > log_{2}*N*. Therefore Shellsort has prospects of running in an average time that asymptotically grows like *N* log *N* only when using gap sequences whose number of gaps grows in proportion to the logarithm of the array size. It is, however, unknown whether Shellsort can reach this asymptotic order of average-case complexity, which is optimal for comparison sorts.
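Substituting *m* = log_{2}*N* into the first bound shows why logarithmically many passes are the interesting regime (a short derivation added for clarity):

```latex
m\,N^{1+1/m}\Big|_{m=\log_2 N}
  = N\log_2 N \cdot N^{1/\log_2 N}
  = 2\,N\log_2 N,
\qquad\text{since } N^{1/\log_2 N} = 2^{(\log_2 N)/\log_2 N} = 2.
```

So at *m* = log_{2}*N* the lower bound is Θ(*N* log *N*), matching the optimal order for comparison sorts; with fewer passes the bound Ω(*mN*^{1+1/*m*}) is strictly larger.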

The worst-case complexity of any version of Shellsort is of higher order: Plaxton, Poonen, and Suel showed that it grows at least as rapidly as Ω(*N*(log *N*/log log *N*)^{2}).
