# Slowsort

Slowsort is a sorting algorithm of humorous nature, deliberately designed to be inefficient. It is a reluctant algorithm based on the principle of *multiply and surrender* (a parody formed by taking the opposites of *divide and conquer*). It was published in 1986 by Andrei Broder and Jorge Stolfi in their paper *Pessimal Algorithms and Simplexity Analysis*[1] (a parody of optimal algorithms and complexity analysis).

## Algorithm

Slowsort is a recursive algorithm.

This is an implementation in pseudocode:

```
procedure slowsort(A[], i, j)          // Sort array range A[i ... j] in-place.
    if i ≥ j then
        return
    m := floor( (i+j)/2 )
    slowsort(A, i, m)                  // (1.1)
    slowsort(A, m+1, j)                // (1.2)
    if A[j] < A[m] then
        swap A[j], A[m]                // (1.3)
    slowsort(A, i, j-1)                // (2)
```

• Sort the first half, recursively. (1.1)
• Sort the second half, recursively. (1.2)
• Find the maximum of the whole array by comparing the results of 1.1 and 1.2, and place it at the end of the list. (1.3)
• Sort the entire list (except for the maximum now at the end), recursively. (2)
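The four steps above can be transcribed directly into Python (an illustrative sketch; the function name and default arguments are choices made here, not part of the original pseudocode):

```python
def slowsort(A, i=0, j=None):
    """Sort A[i..j] (inclusive bounds) in place by multiply and surrender."""
    if j is None:
        j = len(A) - 1
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(A, i, m)            # (1.1) sort the first half
    slowsort(A, m + 1, j)        # (1.2) sort the second half
    if A[j] < A[m]:
        A[j], A[m] = A[m], A[j]  # (1.3) move the maximum to the end
    slowsort(A, i, j - 1)        # (2) sort everything except the maximum

data = [5, 1, 4, 2, 3]
slowsort(data)
# data is now [1, 2, 3, 4, 5]
```

Note that step (2) re-sorts a range that steps (1.1) and (1.2) have already sorted, which is exactly where the "surrender" in multiply and surrender comes from.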

An unoptimized implementation in Haskell (purely functional) may look as follows:

```haskell
slowsort :: (Ord a) => [a] -> [a]
slowsort xs
  | length xs <= 1 = xs
  | otherwise      = slowsort xs' ++ [max llast rlast]  -- (2)
  where m     = length xs `div` 2
        l     = slowsort $ take m xs  -- (1.1)
        r     = slowsort $ drop m xs  -- (1.2)
        llast = last l
        rlast = last r
        xs'   = init l ++ min llast rlast : init r
```


## Complexity

The runtime ${\displaystyle T(n)}$ of Slowsort satisfies the recurrence ${\displaystyle T(n)=2T(n/2)+T(n-1)+1}$.
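The recurrence can be evaluated numerically to see how quickly it outgrows any polynomial (a small sketch; the base case ${\displaystyle T(1)=1}$ and taking ${\displaystyle n/2}$ as a floor are assumptions made here for illustration):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # Recurrence from the text: T(n) = 2 T(n/2) + T(n-1) + 1,
    # with n/2 taken as floor division and T(1) = 1 assumed.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + T(n - 1) + 1

for n in (4, 8, 16, 32):
    print(n, T(n))
```

For instance, ${\displaystyle T(4)=2T(2)+T(3)+1=8+7+1=16}$, and the values grow superpolynomially from there.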

A lower asymptotic bound for ${\displaystyle T(n)}$ in Landau notation is ${\displaystyle \Omega \left(n^{\frac {\log _{2}(n)}{(2+\epsilon )}}\right)}$ for any ${\displaystyle \epsilon >0}$.

Slowsort therefore does not run in polynomial time. Since its running time does not depend on the input, even its best case is worse than Bubble sort.