User:Smbody/Sand

Some practical results

Here you will find some numerical results obtained from practical implementations of the AKS algorithm. Two variants of the algorithm were used:

Original

This version can be regarded as one of the original formulations of AKS. It is as follows (a Java sketch of the search for r follows the listing):

  1. If n = a^b for integers a > 1, b > 1, output composite;
  2. r = 2
  3. While (r < n)
    {
    if (gcd(r, n) ≠ 1) output composite;
    if (r is prime and r > 2)
      {
      q = greatest prime divisor of (r − 1);
      if ( (q ≥ 4·√r·log(n)) and (n^((r−1)/q) ≢ 1 (mod r)) ) break;
      }
    r = r + 1;
    }
  4. For a = 1 to 2·√r·log(n)
    if ( (x − a)^n ≢ x^n − a (mod x^r − 1, n) ) output composite;
  5. Output prime;
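
For illustration, here is a minimal Java sketch of the search for r in steps 2 and 3, assuming log(n) means log2(n). The class and helper names are made up for this sketch and are not taken from the implementation discussed below.

  import java.math.BigInteger;

  // A minimal sketch of the r-search; names here are illustrative only.
  public class AksRSearchSketch {

      // Returns the r at which the while-loop breaks, -1 if a gcd test has
      // already proved n composite, or 0 if r reaches n (n is then prime).
      static int findR(BigInteger n) {
          double log2n = n.bitLength();                          // ~log2(n)
          for (int r = 2; BigInteger.valueOf(r).compareTo(n) < 0; r++) {
              BigInteger br = BigInteger.valueOf(r);
              if (!n.gcd(br).equals(BigInteger.ONE)) return -1;  // composite
              if (r > 2 && br.isProbablePrime(40)) {
                  int q = largestPrimeFactor(r - 1);
                  boolean qLarge = q >= 4.0 * Math.sqrt(r) * log2n;
                  boolean orderLarge = !n.modPow(BigInteger.valueOf((r - 1) / q), br)
                                         .equals(BigInteger.ONE);
                  if (qLarge && orderLarge) return r;            // break
              }
          }
          return 0; // every r < n was coprime to n and gave no break: n is prime
      }

      // Largest prime divisor by trial division; r stays small enough for this.
      static int largestPrimeFactor(int m) {
          int largest = 1;
          for (int p = 2; (long) p * p <= m; p++)
              while (m % p == 0) { largest = p; m /= p; }
          return m > 1 ? m : largest;
      }
  }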

Optimization

This is the Lenstra/Pomerance variant of the improved AKS algorithm (a Java sketch of step 2 follows the listing):

  1. If n = a^b for integers a > 1, b > 1, output composite;
  2. Find the smallest r such that ord_r(n) > log²(n);
  3. For a = 1 to r
    if (1 < gcd(a, n) < n) output composite;
  4. If n ≤ r, output prime;
  5. For a = 1 to ⌊√φ(r)·log(n)⌋
    if ( (x + a)^n ≢ x^n + a (mod x^r − 1, n) ) output composite;
  6. Output prime;

Here ord_r(n) is the multiplicative order of n modulo r, and φ(r) is Euler's totient function.
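
A minimal Java sketch of step 2, the search for the smallest r, again taking log(n) as log2(n); multiplicativeOrder is a hypothetical helper written for this sketch.

  import java.math.BigInteger;

  // Sketch of finding the smallest r with ord_r(n) > log²(n).
  public class SmallestRSketch {

      static int smallestR(BigInteger n) {
          double log2n = n.bitLength();      // ~log2(n)
          double bound = log2n * log2n;      // log²(n)
          for (int r = 2; ; r++)
              if (multiplicativeOrder(n, r) > bound) return r;
      }

      // ord_r(n): the smallest k ≥ 1 with n^k ≡ 1 (mod r); returns 0 if
      // gcd(n, r) > 1, in which case the order does not exist.
      static long multiplicativeOrder(BigInteger n, int r) {
          BigInteger br = BigInteger.valueOf(r);
          if (!n.gcd(br).equals(BigInteger.ONE)) return 0;
          BigInteger acc = n.mod(br);        // acc = n^k (mod r)
          for (long k = 1; k <= r; k++) {
              if (acc.equals(BigInteger.ONE)) return k;
              acc = acc.multiply(n).mod(br);
          }
          return 0; // not reached when gcd(n, r) = 1
      }
  }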

Implementation details

From the results of testing we learn that raising polynomials to the n-th power is by far the most time-consuming step (more than 99.9% of the running time), so the implementation of polynomial operations is the most important part. The implementation in question was written in Java, with each polynomial represented as a TreeMap holding its BigInteger coefficients keyed by exponent. The number of operations needed for one iteration of the last loop can be estimated as log(n) · r² · log²(n) = r²·log³(n): about log(n) polynomial multiplications (via repeated squaring), r² coefficient multiplications per product of two polynomials of degree below r, and log²(n) bit operations per BigInteger multiplication.
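
The polynomial arithmetic itself is not shown on this page; the following Java sketch illustrates one way the TreeMap representation described above could work. The method names and the exact reduction strategy are assumptions made for this sketch.

  import java.math.BigInteger;
  import java.util.Map;
  import java.util.TreeMap;

  // Sparse polynomials as TreeMap<exponent, coefficient>, reduced
  // modulo (x^r − 1, n).
  public class PolySketch {

      // Product of a and b modulo (x^r − 1, n): exponents wrap around mod r
      // (since x^r ≡ 1) and coefficients are reduced mod n. This performs the
      // r² coefficient multiplications counted in the estimate above.
      static TreeMap<Integer, BigInteger> multiply(TreeMap<Integer, BigInteger> a,
                                                   TreeMap<Integer, BigInteger> b,
                                                   int r, BigInteger n) {
          TreeMap<Integer, BigInteger> c = new TreeMap<>();
          for (Map.Entry<Integer, BigInteger> ea : a.entrySet())
              for (Map.Entry<Integer, BigInteger> eb : b.entrySet()) {
                  int exp = (ea.getKey() + eb.getKey()) % r;
                  BigInteger sum = c.getOrDefault(exp, BigInteger.ZERO)
                                    .add(ea.getValue().multiply(eb.getValue()));
                  c.put(exp, sum.mod(n));
              }
          c.values().removeIf(v -> v.signum() == 0);   // keep the map sparse
          return c;
      }

      // p^e modulo (x^r − 1, n) by binary exponentiation: about log2(e)
      // squarings, which is the log(n) factor in the estimate when e = n.
      static TreeMap<Integer, BigInteger> power(TreeMap<Integer, BigInteger> p,
                                                BigInteger e, int r, BigInteger n) {
          TreeMap<Integer, BigInteger> result = new TreeMap<>();
          result.put(0, BigInteger.ONE);               // the constant polynomial 1
          TreeMap<Integer, BigInteger> base = new TreeMap<>(p);
          for (int i = 0; i < e.bitLength(); i++) {
              if (e.testBit(i)) result = multiply(result, base, r, n);
              base = multiply(base, base, r, n);
          }
          return result;
      }
  }

With such helpers, the congruence in the last step of the algorithm amounts to comparing power(p, n, r, n) for p = x + a against the polynomial x^(n mod r) + a built directly, since x^n ≡ x^(n mod r) (mod x^r − 1).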

Here is the running time t(n) for the first 1000 numbers, plotted on a logarithmic scale:

[Plot: t(n) for the first 1000 numbers, logarithmic scale]

As the plot shows, in this range the running time grows exponentially in the number of digits rather than polynomially. This is possibly because r is of the order of n rather than log(n) for such small inputs; only for big n does r become O(log⁶(n)), as the theory predicts. Nevertheless, this result indicates the order of time needed to run the AKS algorithm, and hence the range of its practical usability for now.