# Quadratic irrational number


In mathematics, a quadratic irrational number (also known as a quadratic irrational, a quadratic irrationality or quadratic surd) is an irrational number that is the solution to some quadratic equation with rational coefficients which is irreducible over the set of rational numbers.[1] Since fractions in the coefficients of a quadratic equation can be cleared by multiplying both sides by their common denominator, a quadratic irrational is an irrational root of some quadratic equation whose coefficients are integers. The quadratic irrational numbers, a subset of the complex numbers, are algebraic numbers of degree 2, and can therefore be expressed as

${\displaystyle {a+b{\sqrt {c}} \over d},}$

for integers a, b, c, d; with b, c and d non-zero, and with c square-free. When c is positive, we get real quadratic irrational numbers, while a negative c gives complex quadratic irrational numbers which are not real numbers. This implies that the quadratic irrationals have the same cardinality as ordered quadruples of integers, and are therefore countable.
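The quadruple representation can be checked numerically. The following sketch (the function name `quad_irrational` is illustrative, not standard) evaluates (a + b√c)/d and verifies that the golden ratio, the quadruple (1, 1, 5, 2), satisfies its quadratic equation x² − x − 1 = 0:

```python
import math

def quad_irrational(a, b, c, d):
    """Numeric value of (a + b*sqrt(c)) / d for integers a, b, c, d."""
    return (a + b * math.sqrt(c)) / d

# The golden ratio (1 + sqrt(5)) / 2 corresponds to (a, b, c, d) = (1, 1, 5, 2);
# it is a root of x^2 - x - 1 = 0, a quadratic with integer coefficients.
phi = quad_irrational(1, 1, 5, 2)
print(abs(phi**2 - phi - 1) < 1e-12)
```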

Quadratic irrationals are used in field theory to construct field extensions of the field of rational numbers ℚ. Given the square-free integer c, the augmentation of ℚ by quadratic irrationals using √c produces a quadratic field ℚ(√c). For example, the inverses of elements of ℚ(√c) are of the same form as the above algebraic numbers:

${\displaystyle {d \over a+b{\sqrt {c}}}={ad-bd{\sqrt {c}} \over a^{2}-b^{2}c}.\,}$
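This rationalization by the conjugate can be carried out with exact arithmetic. In the sketch below (a minimal illustration, not a full field implementation), an element x + y√c is stored as the pair (x, y) and its inverse is computed by dividing the conjugate by the norm x² − y²c:

```python
from fractions import Fraction

def inverse(x, y, c):
    """Exact inverse of x + y*sqrt(c) in Q(sqrt(c)), returned as a pair
    (u, v) meaning u + v*sqrt(c).  Multiply numerator and denominator by
    the conjugate x - y*sqrt(c); the denominator becomes the rational
    norm x^2 - y^2*c."""
    norm = Fraction(x * x - y * y * c)
    return (Fraction(x) / norm, Fraction(-y) / norm)

# 1 / (3 + 2*sqrt(2)) = 3 - 2*sqrt(2), since (3 + 2√2)(3 - 2√2) = 9 - 8 = 1.
print(inverse(3, 2, 2))
```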

Quadratic irrationals have useful properties, especially in relation to continued fractions, where we have the result that all real quadratic irrationals, and only real quadratic irrationals, have periodic continued fraction forms. For example,

${\displaystyle {\sqrt {3}}=1.732\ldots =[1;1,2,1,2,1,2,\ldots ]}$
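The periodicity is visible if one computes the partial quotients with the standard integer recurrence for the continued fraction of √D (the function name `sqrt_cf` is illustrative):

```python
import math

def sqrt_cf(D, terms=8):
    """First `terms` partial quotients of the continued fraction of sqrt(D),
    for D a non-square natural number, via the standard recurrence
    m' = d*a - m,  d' = (D - m'^2) / d,  a' = (a0 + m') // d'."""
    a0 = math.isqrt(D)
    m, d, a = 0, 1, a0
    out = [a0]
    for _ in range(terms - 1):
        m = d * a - m
        d = (D - m * m) // d
        a = (a0 + m) // d
        out.append(a)
    return out

print(sqrt_cf(3))  # [1, 1, 2, 1, 2, 1, 2, 1] -- the periodic pattern 1, 2
```

The repeating block 1, 2 matches the expansion [1; 1, 2, 1, 2, …] above; a rational number, by contrast, has a finite continued fraction.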

## Square root of non-square is irrational

The definition of quadratic irrationals requires them to satisfy two conditions: they must satisfy a quadratic equation and they must be irrational. The solutions to the quadratic equation ax² + bx + c = 0 are

${\displaystyle {\frac {-b\pm {\sqrt {b^{2}-4ac}}}{2a}}.}$

Thus quadratic irrationals are precisely those real numbers in this form that are not rational. Since b and 2a are both integers, asking when the above quantity is irrational is the same as asking when the square root of an integer is irrational. The answer to this is that the square root of any natural number that is not a square number is irrational.
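The reduction to square roots of integers gives a simple rationality test: the roots of ax² + bx + c = 0 are rational exactly when the discriminant b² − 4ac is a perfect square. A short sketch of that test (the function name is illustrative):

```python
import math

def rational_roots(a, b, c):
    """True if ax^2 + bx + c = 0 (integers, a != 0) has rational roots,
    i.e. the discriminant b^2 - 4ac is a nonnegative perfect square."""
    disc = b * b - 4 * a * c
    return disc >= 0 and math.isqrt(disc) ** 2 == disc

print(rational_roots(1, -3, 2))  # True: roots 1 and 2
print(rational_roots(1, 0, -2))  # False: roots +/- sqrt(2), quadratic irrationals
```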

The square root of 2 was the first such number to be proved irrational. Theodorus of Cyrene proved the irrationality of the square roots of whole numbers up to 17, but stopped there, probably because the algebra he used could not be applied to the square root of numbers greater than 17. Euclid's Elements Book 10 is dedicated to classification of irrational magnitudes. The original proof of the irrationality of the non-square natural numbers depends on Euclid's lemma.

Many proofs of the irrationality of the square roots of non-square natural numbers implicitly assume the fundamental theorem of arithmetic, which was first proven by Carl Friedrich Gauss in his Disquisitiones Arithmeticae. This asserts that every integer has a unique factorization into primes. For any rational non-integer in lowest terms there must be a prime in the denominator which does not divide the numerator. When the numerator is squared, that prime will still not divide it, by unique factorization. Therefore, the square of a rational non-integer is always a non-integer; by the contrapositive, the square root of an integer is always either another integer or irrational.

Euclid used a restricted version of the fundamental theorem and some careful argument to prove the theorem. His proof is in Euclid's Elements Book X Proposition 9.[2]

The fundamental theorem of arithmetic is not actually required to prove the result, however. There are self-contained proofs by Richard Dedekind,[3] among others. The following proof was adapted by Colin Richard Hughes from a proof of the irrationality of the square root of two found by Theodor Estermann in 1975.[4][5]

Assume D is a non-square natural number; then there is a natural number n such that:

n² < D < (n + 1)²,

so in particular

0 < √D − n < 1.

Assume the square root of D is a rational number p/q, and assume this q is the smallest denominator for which this is true, hence the smallest number for which q√D is also an integer. Then:

(√D − n)q√D = qD − nq√D

is also an integer. But 0 < (√D − n) < 1 so (√D − n)q < q. Hence (√D − n)q is a positive integer smaller than q such that (√D − n)q√D is also an integer. This is a contradiction, since q was defined to be the smallest number with this property; hence √D cannot be rational.
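The descent step can be illustrated numerically. Since no rational actually equals √2, the sketch below starts from a continued-fraction convergent 1393/985 of √2, for which the replacement p/q → (qD − np)/(p − nq) from the argument above still produces strictly smaller positive denominators until the process bottoms out (an illustration of the descent, not a proof script):

```python
import math

def descent(p, q, D):
    """Iterate the Estermann-style descent step: if sqrt(D) = p/q with
    n = isqrt(D), then sqrt(D) also equals (q*D - n*p) / (p - n*q),
    whose denominator (sqrt(D) - n)*q is a strictly smaller positive
    integer.  Returns the sequence of denominators produced."""
    n = math.isqrt(D)
    qs = [q]
    while q > 0:
        p, q = q * D - n * p, p - n * q
        qs.append(q)
    return qs

# Starting from the convergent 1393/985 of sqrt(2), the denominators shrink:
print(descent(1393, 985, 2))  # [985, 408, 169, 70, 29, 12, 5, 2, 1, 0]
```

Because the denominators form a strictly decreasing sequence of positive integers, no smallest q can exist, which is the contradiction used in the proof.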

## References

1. ^ Jörn Steuding, Diophantine Analysis, (2005), Chapman & Hall, p.72.
2. ^ Euclid. "Euclid's Elements Book X Proposition 9". D.E.Joyce, Clark University. Retrieved 2008-10-29.
3. ^ A. Bogomolny. "Square root of 2 is irrational". Interactive Mathematics Miscellany and Puzzles. Retrieved May 5, 2016.
4. ^ Hughes, Colin Richard (1999). "Irrational roots". Mathematical Gazette. 83 (498): 502–503.
5. ^ Estermann, Theodor (1975). "The irrationality of √2". Mathematical Gazette. 59 (408): 110.