Quadratic probing is an open addressing scheme in computer programming for resolving hash collisions in hash tables. Quadratic probing operates by taking the original hash index and adding successive values of an arbitrary quadratic polynomial until an open slot is found.

An example sequence using quadratic probing is:

$H+1^{2},\ H+2^{2},\ H+3^{2},\ H+4^{2},\ \ldots,\ H+k^{2}$

Quadratic probing can be a more efficient algorithm in an open addressing table, since it better avoids the clustering problem that can occur with linear probing, although it is not immune. It also provides good memory caching because it preserves some locality of reference; however, linear probing has greater locality and, thus, better cache performance.
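As an illustration, the probe sequence above can be generated directly. This is a minimal sketch (the function name and parameters are illustrative, not part of any standard API):

```python
def probe_sequence(h, m, limit=8):
    """First `limit` probe indices for initial hash index h in a table of
    size m, using the simple quadratic offsets H, H+1^2, H+2^2, ..."""
    return [(h + i * i) % m for i in range(limit)]

print(probe_sequence(3, 16))  # [3, 4, 7, 12, 3, 12, 7, 4]
```

Note how the sequence revisits slots (here 3, 12, 7, 4 repeat): with this simple offset polynomial, the probes do not cover the whole table, which motivates the choices of constants discussed below.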

Let h(k) be a hash function that maps an element k to an integer in [0, m−1], where m is the size of the table. Let the ith probe position for a value k be given by the function

$h(k,i)=h(k)+c_{1}i+c_{2}i^{2}{\pmod {m}}$

where c2 ≠ 0. (If c2 = 0, then h(k,i) degrades to a linear probe.) For a given hash table, the values of c1 and c2 remain constant.

Examples:

• If $h(k,i)=(h(k)+i+i^{2}){\pmod {m}}$, then the probe sequence will be $h(k),\ h(k)+2,\ h(k)+6,\ \ldots$
• For $m=2^{n}$, a good choice for the constants is c1 = c2 = 1/2, as the values of h(k,i) for i in [0, m−1] are all distinct. This leads to a probe sequence of $h(k),\ h(k)+1,\ h(k)+3,\ h(k)+6,\ \ldots$ (the triangular numbers), where the values increase by 1, 2, 3, ...
• For prime m > 2, most choices of c1 and c2 will make h(k,i) distinct for i in [0, (m−1)/2]. Such choices include c1 = c2 = 1/2, c1 = c2 = 1, and c1 = 0, c2 = 1. However, there are only about m/2 distinct probes for a given element, so other techniques are required to guarantee that insertions succeed when the load factor exceeds 1/2.
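The triangular-number case for $m=2^{n}$ lends itself to a compact implementation, since with c1 = c2 = 1/2 the offset at step i is simply i(i+1)/2. The following is a minimal sketch of an open-addressing table built on that choice (the class and method names are illustrative):

```python
class QuadraticHashTable:
    """Open-addressing table of power-of-two size using triangular-number
    probing (c1 = c2 = 1/2), whose probe sequence visits every slot once."""

    def __init__(self, size=16):
        self.m = size                 # must be a power of two
        self.slots = [None] * size    # each slot holds (key, value) or None

    def _probe(self, key):
        # Yield the probe sequence h(k), h(k)+1, h(k)+3, h(k)+6, ... mod m.
        h = hash(key) % self.m
        for i in range(self.m):
            yield (h + i * (i + 1) // 2) % self.m

    def insert(self, key, value):
        for idx in self._probe(key):
            if self.slots[idx] is None or self.slots[idx][0] == key:
                self.slots[idx] = (key, value)
                return
        raise RuntimeError("table full")

    def get(self, key):
        for idx in self._probe(key):
            if self.slots[idx] is None:
                return None           # an empty slot ends the search
            if self.slots[idx][0] == key:
                return self.slots[idx][1]
        return None
```

Because the triangular-number sequence is a permutation of the slots, inserting m distinct keys into a table of size m always succeeds; a production table would of course resize well before that point.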

## Limitations

With quadratic probing, however (except in the triangular-number case for a hash table of size $2^{n}$), there is no guarantee of finding an empty cell once the table becomes more than half full, or even earlier if the table size is composite, because collisions must be resolved using at most half of the table.
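The effect of the table size can be seen by counting how many distinct slots the simple offsets $i^{2}{\pmod {m}}$ can ever reach. A small sketch (the function name is illustrative):

```python
def coverage(m):
    """Number of distinct offsets i^2 mod m over all i, i.e. how many
    slots simple quadratic probing can ever reach in a table of size m."""
    return len({(i * i) % m for i in range(m)})

print(coverage(12))  # 4  -- composite size: only 4 of 12 slots reachable
print(coverage(11))  # 6  -- prime size: (11+1)/2 slots reachable
print(coverage(16))  # 4  -- power of two without triangular probing
```

For composite sizes the reachable fraction can be far below half the table, which is why prime sizes (or the triangular-number variant for powers of two) are preferred.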

Conversely, the guarantee for prime table sizes can be proven as follows. Suppose a hash table has size $p$ (a prime greater than 3), an initial location $h(k)$, and two alternative locations $h(k)+x^{2}{\pmod {p}}$ and $h(k)+y^{2}{\pmod {p}}$, where $0\leq x,y\leq p/2$. If these two locations are the same slot but $x\neq y$, then

$h(k)+x^{2}\equiv h(k)+y^{2}{\pmod {p}}$
$x^{2}\equiv y^{2}{\pmod {p}}$
$x^{2}-y^{2}\equiv 0{\pmod {p}}$
$(x-y)(x+y)\equiv 0{\pmod {p}}.$

As $p$ is a prime number, either $(x-y)$ or $(x+y)$ must be divisible by $p$. Since $x$ and $y$ are distinct and both at most $(p-1)/2$, we have $0<|x-y|<p$ and $0<x+y<p$, so neither factor is divisible by $p$, a contradiction. Thus, the first $\lfloor p/2\rfloor$ alternative locations after $h(k)$ are all distinct, and consequently an empty space can always be found so long as at most $p/2$ locations are filled (i.e., the hash table is not more than half full).
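The distinctness argument can be checked numerically. This sketch (names illustrative) confirms that the offsets $x^{2}{\pmod {p}}$ for $x=0,\ldots,\lfloor p/2\rfloor$ are pairwise distinct for several primes:

```python
def quadratic_offsets(p):
    """Offsets x^2 mod p for x = 0 .. p//2; by the argument above these
    are pairwise distinct when p is prime."""
    return {(x * x) % p for x in range(p // 2 + 1)}

for p in (5, 7, 11, 13, 101):
    # p//2 + 1 probes (including x = 0) land on p//2 + 1 distinct slots.
    assert len(quadratic_offsets(p)) == p // 2 + 1
```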

### Alternating signs

If the sign of the offset is alternated (e.g. +1, −4, +9, −16, etc.), and if the number of buckets is a prime number $p$ congruent to 3 modulo 4 (e.g. 3, 7, 11, 19, 23, 31, etc.), then the first $p$ offsets will be unique (modulo $p$ ). In other words, a permutation of 0 through $p-1$ is obtained, and, consequently, a free bucket will always be found as long as at least one exists.
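The permutation property can likewise be verified numerically; this sketch (function name illustrative) computes the alternating-sign offsets and checks that they cover every residue for primes $p\equiv 3{\pmod 4}$:

```python
def alternating_offsets(p):
    """Offsets 0, +1, -4, +9, -16, ... reduced mod p (Python's % always
    returns a nonnegative result, so negative offsets wrap correctly)."""
    return {((-1) ** (i + 1) * i * i) % p for i in range(p)}

for p in (3, 7, 11, 19, 23, 31):
    # All p offsets are distinct: a permutation of 0 .. p-1.
    assert len(alternating_offsets(p)) == p
```

By contrast, a prime with $p\equiv 1{\pmod 4}$ (such as 5 or 13) fails this check, so the congruence condition is essential.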