# Pollard's rho algorithm

Pollard's rho algorithm is an algorithm for integer factorization. It was invented by John Pollard in 1975.[1] It uses only a small amount of space, and its expected running time is proportional to the square root of the smallest prime factor of the composite number being factorized.

## Core ideas

The algorithm is used to factorize a number ${\displaystyle n=pq}$, where ${\displaystyle p}$ is a non-trivial factor. A polynomial modulo ${\displaystyle n}$, called ${\displaystyle g(x)}$ (e.g., ${\displaystyle g(x)=(x^{2}+1){\bmod {n}}}$), is used to generate a pseudorandom sequence. Note that ${\displaystyle g(x)}$ must be a polynomial with integer coefficients, so that ${\displaystyle g(x){\bmod {p}}}$ depends only on ${\displaystyle x{\bmod {p}}}$. A starting value, say 2, is chosen, and the sequence continues as ${\displaystyle x_{1}=g(2)}$, ${\displaystyle x_{2}=g(g(2))}$, ${\displaystyle x_{3}=g(g(g(2)))}$, etc. This sequence is related to a second sequence ${\displaystyle \{x_{k}{\bmod {p}}\}}$, which, by the polynomial property above, is itself generated by iterating ${\displaystyle g}$ modulo ${\displaystyle p}$. Since ${\displaystyle p}$ is not known beforehand, this second sequence cannot be explicitly computed in the algorithm; yet in it lies the core idea of the algorithm.

Because the number of possible values for these sequences is finite, both the ${\displaystyle \{x_{k}\}}$ sequence, which is mod ${\displaystyle n}$, and ${\displaystyle \{x_{k}{\bmod {p}}\}}$ sequence will eventually repeat, even though these values are unknown. If the sequences were to behave like random numbers, the birthday paradox implies that the number of ${\displaystyle x_{k}}$ before a repetition occurs would be expected to be ${\displaystyle O({\sqrt {N}})}$, where ${\displaystyle N}$ is the number of possible values. So the sequence ${\displaystyle \{x_{k}{\bmod {p}}\}}$ will likely repeat much earlier than the sequence ${\displaystyle \{x_{k}\}}$. When one has found a ${\displaystyle k_{1},k_{2}}$ such that ${\displaystyle x_{k_{1}}\neq x_{k_{2}}}$ but ${\displaystyle x_{k_{1}}\equiv x_{k_{2}}{\bmod {p}}}$, the number ${\displaystyle |x_{k_{1}}-x_{k_{2}}|}$ is a multiple of ${\displaystyle p}$, so ${\displaystyle p}$ has been found.
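The earlier collision modulo ${\displaystyle p}$ can be observed directly on a small composite. The following Python sketch (the modulus ${\displaystyle n=10403=101\cdot 103}$ and all names are illustrative choices) iterates ${\displaystyle g(x)=(x^{2}+1){\bmod {n}}}$ and records when each sequence first revisits a value:

```python
def first_repeat_index(values):
    """Index of the first term equal to some earlier term, or None."""
    seen = set()
    for i, v in enumerate(values):
        if v in seen:
            return i
        seen.add(v)

def orbit(g, x0, length):
    """The sequence x0, g(x0), g(g(x0)), ... of the given length."""
    xs = [x0]
    while len(xs) < length:
        xs.append(g(xs[-1]))
    return xs

n, p = 10403, 101                  # illustrative composite with factor p
g = lambda x: (x * x + 1) % n
xs = orbit(g, 2, n + 2)            # long enough to guarantee a repeat mod n

rep_mod_n = first_repeat_index(xs)
rep_mod_p = first_repeat_index(v % p for v in xs)
# The "shadow" sequence mod p collides far sooner than the sequence mod n.
print(rep_mod_p, rep_mod_n)
```

Here the sequence modulo 101 repeats after only 17 steps, while the full sequence modulo 10403 runs much longer before repeating, in line with the ${\displaystyle O({\sqrt {N}})}$ birthday-paradox estimate.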

Once a sequence has a repeated value, the sequence will cycle, because each value depends only on the one before it. This structure of eventual cycling gives rise to the name "rho algorithm", owing to similarity to the shape of the Greek letter ρ when the values ${\displaystyle x_{1}{\bmod {p}}}$, ${\displaystyle x_{2}{\bmod {p}}}$, etc. are represented as nodes in a directed graph.

This is detected by Floyd's cycle-finding algorithm: two nodes ${\displaystyle i}$ and ${\displaystyle j}$ (i.e., ${\displaystyle x_{i}}$ and ${\displaystyle x_{j}}$) are kept. In each step, one moves to the next node in the sequence and the other moves forward by two nodes. After that, it is checked whether ${\displaystyle \gcd(x_{i}-x_{j},n)\neq 1}$. If it is not 1, then this implies that there is a repetition in the ${\displaystyle \{x_{k}{\bmod {p}}\}}$ sequence (i.e., ${\displaystyle x_{i}{\bmod {p}}=x_{j}{\bmod {p}}}$). This works because if ${\displaystyle x_{i}{\bmod {p}}}$ is the same as ${\displaystyle x_{j}{\bmod {p}}}$, the difference between ${\displaystyle x_{i}}$ and ${\displaystyle x_{j}}$ is necessarily a multiple of ${\displaystyle p}$, so the resulting greatest common divisor (GCD) is a divisor of ${\displaystyle n}$ other than 1; such a repetition always happens eventually. The GCD may, however, be ${\displaystyle n}$ itself, since the two sequences might repeat at the same time. In this (uncommon) case the algorithm fails, and can be repeated with a different parameter.

## Algorithm

The algorithm takes as its inputs n, the integer to be factored; and ${\displaystyle g(x)}$, a polynomial in x computed modulo n. In the original algorithm, ${\displaystyle g(x)=(x^{2}-1){\bmod {n}}}$, but nowadays it is more common to use ${\displaystyle g(x)=(x^{2}+1){\bmod {n}}}$. The output is either a non-trivial factor of n, or failure.

It performs the following steps:[2]

Pseudocode for Pollard's rho algorithm

    x ← 2 // starting value
    y ← x
    d ← 1

    while d = 1:
        x ← g(x)
        y ← g(g(y))
        d ← gcd(|x - y|, n)

    if d = n:
        return failure
    else:
        return d


Here x and y correspond to ${\displaystyle x_{i}}$ and ${\displaystyle x_{j}}$ in the previous section. Note that this algorithm may fail to find a nontrivial factor even when n is composite. In that case, the method can be tried again, using a starting value of x other than 2 (${\displaystyle 0\leq x<n}$) or a different ${\displaystyle g(x)}$, ${\displaystyle g(x)=(x^{2}+b){\bmod {n}}}$, with ${\displaystyle 1\leq b<n-2}$.
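The pseudocode translates directly into a short program. The following Python sketch (the function name and default parameters are illustrative choices) implements the steps above with Floyd's cycle detection, returning `None` on failure so the caller can retry with different parameters:

```python
from math import gcd

def pollard_rho(n, x0=2, b=1):
    """Pollard's rho with Floyd cycle detection.

    Returns a non-trivial factor of n, or None on failure
    (in which case x0 or b can be varied and the run repeated).
    """
    g = lambda v: (v * v + b) % n   # g(x) = (x^2 + b) mod n
    x = y = x0
    d = 1
    while d == 1:
        x = g(x)        # moves one step per iteration
        y = g(g(y))     # moves two steps per iteration
        d = gcd(abs(x - y), n)
    return None if d == n else d
```

On the examples worked later in this article, `pollard_rho(8051)` finds the factor 97 and `pollard_rho(10403)` finds 101.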

## Example factorization

Let ${\displaystyle n=8051}$ and ${\displaystyle g(x)=(x^{2}+1){\bmod {8051}}}$.

| i | x    | y    | gcd(\|x − y\|, 8051) |
|---|------|------|----------------------|
| 1 | 5    | 26   | 1  |
| 2 | 26   | 7474 | 1  |
| 3 | 677  | 871  | 97 |
| 4 | 7474 | 1481 | 1  |

Now 97 is a non-trivial factor of 8051. Starting values other than x = y = 2 may give the cofactor (83) instead of 97. One extra iteration is shown above to make it clear that y moves twice as fast as x. Note that even after a repetition, the GCD can return to 1.
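The table can be reproduced in a few lines of Python (a sketch; the variable names are mine), running the same iteration for four steps:

```python
from math import gcd

n = 8051
g = lambda v: (v * v + 1) % n   # g(x) = (x^2 + 1) mod 8051

x = y = 2
rows = []
for i in range(1, 5):
    x = g(x)        # x advances one step
    y = g(g(y))     # y advances two steps
    rows.append((i, x, y, gcd(abs(x - y), n)))

for row in rows:
    print(row)      # the third row, (3, 677, 871, 97), reveals the factor
```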

## Variants

In 1980, Richard Brent published a faster variant of the rho algorithm. He used the same core ideas as Pollard but a different method of cycle detection, replacing Floyd's cycle-finding algorithm with the related Brent's cycle finding method.[3]

A further improvement was made by Pollard and Brent. They observed that if ${\displaystyle \gcd(a,n)>1}$, then also ${\displaystyle \gcd(ab,n)>1}$ for any positive integer ${\displaystyle b}$. In particular, instead of computing ${\displaystyle \gcd(|x-y|,n)}$ at every step, it suffices to define ${\displaystyle z}$ as the product of 100 consecutive ${\displaystyle |x-y|}$ terms modulo ${\displaystyle n}$, and then compute a single ${\displaystyle \gcd(z,n)}$. A major speed up results as 100 gcd steps are replaced with 99 multiplications modulo ${\displaystyle n}$ and a single gcd. Occasionally it may cause the algorithm to fail by introducing a repeated factor, for instance when ${\displaystyle n}$ is a square. But it then suffices to go back to the previous gcd term, where ${\displaystyle \gcd(z,n)=1}$, and use the regular ρ algorithm from there.
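The product trick can be sketched in Python as follows. This is a simplified illustration, not Brent's exact formulation: the batch size, the backtracking policy, and all names are choices of this sketch.

```python
from math import gcd

def pollard_rho_batched(n, x0=2, b=1, batch=100):
    """Rho with Floyd iteration, taking one gcd per batch of differences."""
    g = lambda v: (v * v + b) % n
    x = y = x0
    while True:
        xs, ys = x, y               # saved state, in case we must backtrack
        z = 1
        for _ in range(batch):
            x = g(x)
            y = g(g(y))
            z = z * abs(x - y) % n  # accumulate |x - y| terms modulo n
        d = gcd(z, n)
        if d == 1:
            continue                # no factor in this batch; keep going
        if d < n:
            return d
        # d == n: the batch swallowed every factor at once (or hit a zero
        # term); redo it one gcd at a time from the saved state.
        x, y = xs, ys
        for _ in range(batch):
            x = g(x)
            y = g(g(y))
            d = gcd(abs(x - y), n)
            if 1 < d < n:
                return d
            if d == n:
                return None         # genuine failure; retry with other parameters
```

The backtracking branch is exactly the "go back to the previous gcd term" recovery described above: the batch is replayed step by step with a gcd at every step, so a factor found inside the batch is not lost.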

## Application

The algorithm is very fast for numbers with small factors, but slower in cases where all factors are large. The ρ algorithm's most remarkable success was the 1980 factorization of the Fermat number F8 = 1238926361552897 × 93461639715357977769163558199606896584051237541638188580280321.[4] The ρ algorithm was a good choice for F8 because the prime factor p = 1238926361552897 is much smaller than the other factor. The factorization took 2 hours on a UNIVAC 1100/42.[4]

## Example: factoring n = 10403 = 101 · 103

The following table shows numbers produced by the algorithm, starting with ${\displaystyle x=2}$ and using the polynomial ${\displaystyle g(x)=(x^{2}+1){\bmod {10403}}}$. The third and fourth columns of the table contain additional information not known by the algorithm. They are included to show how the algorithm works.

| ${\displaystyle x}$ | ${\displaystyle y}$ | ${\displaystyle x{\bmod {101}}}$ | ${\displaystyle y{\bmod {101}}}$ | step |
|-------|------|----|----|----|
| 2     | 2    | 2  | 2  | 0  |
| 5     | 2    | 5  | 2  | 1  |
| 26    | 2    | 26 | 2  | 2  |
| 677   | 26   | 71 | 26 | 3  |
| 598   | 26   | 93 | 26 | 4  |
| 3903  | 26   | 65 | 26 | 5  |
| 3418  | 26   | 85 | 26 | 6  |
| 156   | 3418 | 55 | 85 | 7  |
| 3531  | 3418 | 97 | 85 | 8  |
| 5168  | 3418 | 17 | 85 | 9  |
| 3724  | 3418 | 88 | 85 | 10 |
| 978   | 3418 | 69 | 85 | 11 |
| 9812  | 3418 | 15 | 85 | 12 |
| 5983  | 3418 | 24 | 85 | 13 |
| 9970  | 3418 | 72 | 85 | 14 |
| 236   | 9970 | 34 | 72 | 15 |
| 3682  | 9970 | 46 | 72 | 16 |
| 2016  | 9970 | 97 | 72 | 17 |
| 7087  | 9970 | 17 | 72 | 18 |
| 10289 | 9970 | 88 | 72 | 19 |
| 2594  | 9970 | 69 | 72 | 20 |
| 8499  | 9970 | 15 | 72 | 21 |
| 4973  | 9970 | 24 | 72 | 22 |
| 2799  | 9970 | 72 | 72 | 23 |

The first repetition modulo 101 is the value 97, which first appears in step 8 and appears again in step 17. The repetition is not detected until step 23, when ${\displaystyle x\equiv y{\pmod {101}}}$. This causes ${\displaystyle \gcd(|x-y|,n)=\gcd(|2799-9970|,n)}$ to be ${\displaystyle p=101}$, and a factor is found.

## Complexity

If the pseudorandom sequence ${\displaystyle x_{k+1}=g(x_{k})}$ occurring in the Pollard ρ algorithm behaved like an actual random sequence, it would follow from the birthday paradox that success would be achieved half the time within ${\displaystyle O({\sqrt {p}})\leq O(n^{1/4})}$ iterations. It is believed that the same analysis applies as well to the actual rho algorithm, but this is a heuristic claim, and rigorous analysis of the algorithm remains open.[5]