Some practical results
Here you will find some numerical results obtained from practical implementations of the AKS algorithm. Two variants of the algorithm were used:
Original
This version can be regarded as one of the original versions of AKS. It is as follows (a Java sketch of the r-search loop is given after the pseudocode):
- r = 2
- While (r < n)
- {
- if (gcd(r, n) ≠ 1) output composite;
- if (r is prime, r > 2)
- q = greatest prime divisor of (r - 1);
- if ( (q ≥ 4√r·log(n)) and (n^((r − 1)/q) ≢ 1 (mod r)) ) break;
- r = r + 1;
- }
- For a = 1 to 2√r·log(n)
- if ( (x − a)^n ≢ x^n − a (mod x^r − 1, n) ) output composite;
- Output prime;
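For illustration, here is a minimal Java sketch of the r-search loop above (everything before the final for loop). The class and method names (AksOriginalRSearch, findR, largestPrimeFactor) are chosen for this sketch and are not taken from the tested implementation; bitLength() is used as a rough stand-in for log2(n).

import java.math.BigInteger;

public class AksOriginalRSearch {

    // Trial-division primality test; r stays relatively small, so this is adequate here.
    static boolean isPrime(long m) {
        if (m < 2) return false;
        for (long d = 2; d * d <= m; d++)
            if (m % d == 0) return false;
        return true;
    }

    // Largest prime factor of m, again by trial division.
    static long largestPrimeFactor(long m) {
        long largest = 1;
        for (long d = 2; d * d <= m; d++)
            while (m % d == 0) { largest = d; m /= d; }
        return m > 1 ? m : largest;
    }

    // Runs the while loop of the original version.
    // Returns the r at which it breaks, -1 if a gcd check proves n composite,
    // or 0 if the loop exhausts (every r < n is then coprime to n, i.e. n is prime).
    static long findR(BigInteger n) {
        double logN = n.bitLength();                 // rough stand-in for log2(n)
        for (long r = 2; BigInteger.valueOf(r).compareTo(n) < 0; r++) {
            BigInteger bigR = BigInteger.valueOf(r);
            if (!n.gcd(bigR).equals(BigInteger.ONE)) return -1;   // composite
            if (isPrime(r) && r > 2) {
                long q = largestPrimeFactor(r - 1);
                // break when q >= 4*sqrt(r)*log(n) and n^((r-1)/q) != 1 (mod r)
                if (q >= 4 * Math.sqrt(r) * logN
                        && !n.modPow(BigInteger.valueOf((r - 1) / q), bigR).equals(BigInteger.ONE)) {
                    return r;
                }
            }
        }
        return 0;
    }
}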
Optimization
This is the Lenstra/Pomerance variant of the improved AKS (a Java sketch of the r-search step is given after the pseudocode):
- Find the smallest r such that ord_r(n) > log^2(n), where ord_r(n) is the multiplicative order of n modulo r
- For a = 1 to r
- if (1 < gcd(a, n) < n) output composite;
- if (n ≤ r) output prime;
- For a = 1 to ⌊√φ(r)·log(n)⌋, where φ is Euler's totient function
- if ( (x + a)^n ≢ x^n + a (mod x^r − 1, n) ) output composite;
- Output prime;
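For illustration, here is a minimal Java sketch of the first step, finding the smallest r with ord_r(n) > log^2(n). The class and method names are choices of this sketch, not of the tested implementation, and bitLength() is again used as an approximation of log2(n).

import java.math.BigInteger;

public class AksFindSmallestR {

    // Smallest r such that the multiplicative order of n modulo r exceeds log2(n)^2.
    static long findR(BigInteger n) {
        double bound = Math.pow(n.bitLength(), 2);   // rough stand-in for log2(n)^2
        for (long r = 2; ; r++) {
            BigInteger bigR = BigInteger.valueOf(r);
            if (!n.gcd(bigR).equals(BigInteger.ONE)) continue;  // ord_r(n) is defined only for gcd(n, r) = 1
            long order = 1;
            BigInteger pow = n.mod(bigR);
            while (!pow.equals(BigInteger.ONE)) {               // multiply until n^order = 1 (mod r)
                pow = pow.multiply(n).mod(bigR);
                order++;
                if (order > bound) return r;                    // order already exceeds log2(n)^2
            }
        }
    }
}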
Implementation details
The results of testing show that raising polynomials to the n-th power is by far the most time-consuming step (more than 99.9% of the running time), so the implementation of polynomial operations is the most important part. The implementation in question was written in Java, each polynomial being represented as a TreeMap with BigInteger coefficients. The number of operations required for one iteration of the last for loop can be estimated as log(n) · r^2 · log^2(n) = r^2·log^3(n): the three factors are the number of polynomial multiplications (by repeated squaring), the number of coefficient multiplications needed to multiply two polynomials of degree r, and the number of operations needed to multiply two BigIntegers, respectively.
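Below is a minimal Java sketch, in the spirit of the description above, of how such a polynomial can be held in a TreeMap with BigInteger coefficients and raised to the n-th power modulo (x^r − 1, n) by repeated squaring. The class and method names (ModPoly, multiply, pow) are illustrative and not taken from the tested code.

import java.math.BigInteger;
import java.util.Map;
import java.util.TreeMap;

// A polynomial modulo (x^r - 1, n): keys are exponents in 0..r-1,
// values are the corresponding coefficients reduced mod n.
class ModPoly {
    final TreeMap<Integer, BigInteger> coeffs = new TreeMap<>();
    final int r;
    final BigInteger n;

    ModPoly(int r, BigInteger n) { this.r = r; this.n = n; }

    // Adds c to the coefficient of x^exp, reducing the exponent mod r and the coefficient mod n.
    void add(int exp, BigInteger c) {
        int e = exp % r;
        BigInteger v = coeffs.getOrDefault(e, BigInteger.ZERO).add(c).mod(n);
        if (v.signum() == 0) coeffs.remove(e); else coeffs.put(e, v);
    }

    // Schoolbook multiplication: about r^2 coefficient multiplications, each taking
    // roughly log^2(n) bit operations with naive BigInteger arithmetic.
    ModPoly multiply(ModPoly other) {
        ModPoly result = new ModPoly(r, n);
        for (Map.Entry<Integer, BigInteger> a : coeffs.entrySet())
            for (Map.Entry<Integer, BigInteger> b : other.coeffs.entrySet())
                result.add(a.getKey() + b.getKey(), a.getValue().multiply(b.getValue()));
        return result;
    }

    // this^e mod (x^r - 1, n) by repeated squaring: about log(e) polynomial multiplications.
    ModPoly pow(BigInteger e) {
        ModPoly result = new ModPoly(r, n);
        result.add(0, BigInteger.ONE);                    // result = 1
        ModPoly base = this;
        for (int i = 0; i < e.bitLength(); i++) {
            if (e.testBit(i)) result = result.multiply(base);
            base = base.multiply(base);
        }
        return result;
    }
}

To check the congruence in the last for loop, one would build the polynomial x + a, raise it with pow(n), and compare the result with x^(n mod r) + a, since x^n reduces to x^(n mod r) modulo x^r − 1.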
Here is the dependence of the running time t(n) on n for the first 1000 numbers, on a logarithmic scale:
As you can see, in this range the running time grows exponentially rather than polynomially. This is possibly because r is of the order of n rather than log(n) here, while for large n, r becomes O(log^6(n)), as shown in theory. Nevertheless, this result gives an idea of the time needed to run the AKS algorithm and hence of the range of its usability for now.