
Talk:Primality test

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 99.142.30.54 (talk) at 03:25, 29 August 2012. The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

WikiProject Computing (unassessed)
This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has not yet received a rating on Wikipedia's content assessment scale.
This article has not yet received a rating on the project's importance scale.

WikiProject Computer science (Start-class, Mid-importance)
This article is within the scope of WikiProject Computer science, a collaborative effort to improve the coverage of Computer science related articles on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Mid-importance on the project's importance scale.

Naïve Methods and 4

When testing whether n is divisible by m, using values of m from 2 to n − 1, the method correctly determines that 4 is not prime, because 4 is divisible by 2. However, when using values of m from 2 to the square root of n, an error can occur for n = 4. Since 2 is the square root of 4, the test range is 2 to 2, and if the range is exclusive, as in the loop "for (m = 2; m < sqrt(n); m++)", the whole test is skipped and 4 is incorrectly determined to be a prime number.
68.13.47.75 (talk) 16:09, 10 January 2008 (UTC)[reply]
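The fix the comment above describes is simply to make the upper bound inclusive. A minimal Python sketch (the function name is mine) showing that an inclusive bound handles n = 4 correctly:

```python
import math

def is_prime(n):
    """Trial division with an INCLUSIVE upper bound of floor(sqrt(n)).

    An exclusive bound (m < sqrt(n)) would skip m = 2 when n = 4 and
    wrongly report 4 as prime; m <= isqrt(n) avoids that.
    """
    if n < 2:
        return False
    for m in range(2, math.isqrt(n) + 1):  # range stop is isqrt(n) + 1, so isqrt(n) is tested
        if n % m == 0:
            return False
    return True
```

For example, `is_prime(4)` returns `False` because m = 2 is now inside the tested range.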

"The Miller-Rabin primality test and Solovay-Strassen primality test are more sophisticated variants which detect all composites"

If the methods detect all composites, why are they categorized as probabilistic rather than deterministic? Joestynes 08:31, 2 Mar 2005 (UTC)
These two categories are unrelated. An algorithm is probabilistic if it makes random "coin tosses" during the computation, and produces the correct answer with high probability. An algorithm is approximative if it produces the correct answer only for an (overwhelming) majority of inputs. You can have a deterministic approximative test for primality (e.g., the base-2 Fermat test), and you can have a probabilistic exact test (e.g., Rabin-Miller). The "probability" in "probabilistic test" refers to the internal coin tosses, not to some probability distribution on the inputs. -- EJ 13:49, 2 Mar 2005 (UTC)
What the text should probably say is that the tests may report a composite to be prime, but never report a prime to be composite. Fredrik | talk 13:55, 2 Mar 2005 (UTC)
Maybe the headings should be changed to "Approximative" and "Deterministic", rather than "Probabilistic" and "Fast Deterministic", to avoid creating a false dichotomy. Joestynes 01:28, 3 Mar 2005 (UTC)
This would be wrong. As I explained above, most of the probabilistic tests mentioned in the article are NOT approximative, and approximative vs. deterministic is not a dichotomy at all. -- EJ 12:24, 3 Mar 2005 (UTC)
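EJ's example of a deterministic but approximative test can be made concrete. A sketch (the function name is mine) of the base-2 Fermat test, which uses no randomness at all yet is wrong on the base-2 pseudoprimes:

```python
def fermat_base2(n):
    """Deterministic but only approximative: declare n 'prime' when
    2^(n-1) = 1 (mod n). No coin tosses, yet it errs on base-2
    pseudoprimes such as 341 = 11 * 31."""
    return n == 2 or (n > 2 and pow(2, n - 1, n) == 1)
```

`fermat_base2(341)` returns `True` even though 341 = 11 · 31, illustrating that "deterministic" and "exact" are independent properties.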

The text states:

"A simple, but very inefficient primality test uses Wilson's theorem, which states that p is prime if and only if:

   (p − 1)! ≡ −1 (mod p)

Although this method requires p modular multiplications, rendering it impractical, theorems about primes and modular residues form the basis of many more practical methods." (11/6/09)

Apart from the bottom sentence being a fragment, I don't think the test requires p modular multiplications. Take 7 as an example. 7 − 1 = 6, and 6! = 6·5·4·3·2. Computing 6! takes 4 modular multiplications: (6·5) is the first, (30·4) the second, (120·3) the third, and (360·2) the fourth.


—Preceding unsigned comment added by 76.16.199.113 (talk) 23:06, 11 June 2009 (UTC)[reply]

Okay, actually it's p−2, but this makes no substantive difference. I'll just make it say "about p". Dcoetzee 23:16, 11 June 2009 (UTC)[reply]
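The count the thread settles on ("about p") can be checked directly. A sketch of the Wilson-theorem test (the function name is mine), computing (p−1)! mod p with one multiplication per factor:

```python
def wilson_is_prime(p):
    """Wilson's theorem: p >= 2 is prime iff (p-1)! = -1 (mod p).
    The loop performs one modular multiplication per factor 2..p-1,
    i.e. p - 2 of them -- 'about p', exactly as discussed above."""
    if p < 2:
        return False
    fact = 1
    for k in range(2, p):        # factors 2, 3, ..., p-1
        fact = (fact * k) % p    # one modular multiplication each
    return fact == p - 1         # p - 1 is -1 mod p
```

Correct, but the Θ(p) multiplications make it far slower than trial division, let alone modern tests.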

Computer proof vs pen-and-paper proof

The article says the following:

This is orders of magnitude less than the probability that the computation will be corrupted by hardware errors, software bugs, mistyping the input, some user's death due to heart attack during the computation, or an airplane crashing on your computer lab. All of these affect deterministic tests as well... If you want to know with certainty that a number is prime, you cannot use a computer at all; you have to find a mathematical proof of its primality, short and simple enough that you can write it down on a piece of paper and verify by hand that it is correct.

I disagree with this, or at least, I think it's expressing a POV. Surely a human prover can also make an error in his proof, and in his subsequent verification? I realise that this is a debatable (and philosophical) matter, but I think that the above, particularly the bold phrase, needs to be reworded. Personally, for various particular combinations of humans, computers and numbers, I would trust a computer proof of primality over a human proof ;-) — Matt Crypto 22:08, 22 May 2005 (UTC)[reply]

I have removed the following section (between the rules). Even though it is useful information (pace NPOV concerns) end even though my queries above may have prompted its addition, it doesn't really belong on the primality testing page; it belongs on Randomized algorithm, Approximation algorithm and/or Deterministic algorithm. Primality tests are particular instances of these, so appropriate links and categorization should suffice. Of course, each listed method should state both whether it's deterministic or randomised and whether it's approximation or definite. I'm not qualified to do that. Joestynes 08:18, 3 Jun 2005 (UTC)

Popular misconceptions about randomized algorithms

  • A randomized test is good for most numbers, but a small number of composites will be declared prime. — this is not the case. The test is only required to succeed with a certain probability, but the probability is not taken over the inputs; it is taken over additional auxiliary numbers used in the algorithm. The test must be correct with high probability for every input number n. Having said that, it may also be worthwhile to consider tests which are incorrect on a certain fraction of inputs; such algorithms are called heuristic or approximation algorithms. Notice that heuristic algorithms may be deterministic as well as randomized; it's a different kind of classification.
  • A randomized test does not determine with certainty whether a number is prime or not. Deterministic tests are no better in this respect. If you make, say, 25 iterations of the Miller-Rabin test, the algorithm errs with probability smaller than 10^−15. This is orders of magnitude less than the probability that the computation will be corrupted by hardware errors, software bugs, mistyping the input, some user's death due to heart attack during the computation, or an airplane crashing on your computer lab. All of these affect deterministic tests as well (in fact, for general numbers, the deterministic test will take far longer than a reasonable number of randomized test iterations, and hence be more exposed to hardware errors). If you want to know with certainty that a number is prime, you cannot use a computer at all; you have to find a mathematical proof of its primality, short and simple enough that you can write it down on a piece of paper and verify by hand that it is correct. Of course, your proof may employ algorithms as mathematical entities; but then again, deterministic and randomized tests are no different: if you prove that the Miller-Rabin test declares a number n prime with probability at least 1/2, that is a valid proof that n is prime.

In perfect world, appropriate links suffice. In reality, people do not follow links, especially if they have a false impression that the link explains something well-known or obvious. Moving the section to Randomized algorithm would be useless, as all the information is already there. EJ 12:14, 9 Jun 2005 (UTC)
I think it wouldn't hurt to briefly summarize some of the concerns about probabilistic algorithms in this context, since it's where many people first encounter probabilistic algorithms - then we tell them to follow the link for more information. Deco 03:00, 21 June 2006 (UTC)[reply]
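For readers first meeting probabilistic tests here, a sketch of the 25-round Miller-Rabin test discussed above (the function name is mine; error probability below 4^−25 < 10^−15 per the standard bound, taken over the random bases, not over the inputs):

```python
import random

def miller_rabin(n, rounds=25):
    """Randomized Miller-Rabin test. A composite n survives one round
    with probability < 1/4, so 25 rounds err with probability < 4^-25."""
    if n < 4:
        return n in (2, 3)
    if n % 2 == 0:
        return False
    d, s = n - 1, 0
    while d % 2 == 0:            # write n - 1 = d * 2^s with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)   # random base; this is the only randomness
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False         # a is a witness: n is certainly composite
    return True                  # probable prime
```

Note the asymmetry mentioned earlier in this page: a "composite" answer is always correct; only "prime" answers carry the (tiny) error probability.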

"cyclotomy test"

There's no such link in Wikipedia, and I don't see anything substantive about a "cyclotomy test" on the web except the erroneous link in this article! For the time being I've linked to "Root of Unity", which is a redirect from cyclotomy. 14:25, 21 December 2006 (UTC)

Search on "cyclotomy primality proving". PrimeHunter 20:30, 21 December 2006 (UTC)[reply]

Hardy-Littlewood conjecture

There are several statements which go under that name. The one relevant to AKS is about the distribution of Sophie Germain primes, whereas Hardy-Littlewood conjecture currently redirects to an unrelated statement about the distribution of twin primes. So, please, do not mess with the piped wikilink. -- EJ 14:34, 14 January 2007 (UTC)[reply]

Primality testing using Primorials

There is a straightforward, unsophisticated method of deterministic primality testing involving the primorials, though I have no idea whether it is "fast". It accommodates any and all prime numbers of any size without false positives or gaps when the method is done properly.--Billymac00 15:38, 2 June 2007 (UTC)[reply]

Could you be thinking about Wilson's theorem which uses factorials (not primorials)? It is far too slow to be useful. PrimeHunter 16:08, 2 June 2007 (UTC)[reply]
A number n is prime iff gcd(n, (n - 1)#) = 1. But that's every bit as bad as Wilson's theorem. CRGreathouse (t | c) 21:02, 29 January 2009 (UTC)[reply]
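CRGreathouse's criterion can be sketched directly (function names are mine). It is correct, but as noted it is every bit as impractical as Wilson's theorem, since (n−1)# is an enormous number:

```python
from math import gcd

def primorial(m):
    """(m)#: the product of all primes <= m, found with a simple sieve."""
    sieve = [True] * (m + 1)
    prod = 1
    for p in range(2, m + 1):
        if sieve[p]:
            prod *= p
            for q in range(p * p, m + 1, p):
                sieve[q] = False
    return prod

def primorial_is_prime(n):
    """n >= 2 is prime iff gcd(n, (n-1)#) = 1: any composite n has a
    prime factor <= n - 1, which then divides both n and (n-1)#."""
    return n >= 2 and gcd(n, primorial(n - 1)) == 1
```

The circularity is visible: building (n−1)# already requires knowing all smaller primes, and the primorial itself has on the order of n bits.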

Probabilistic testing

When I read about a percentage being associated with probabilistic testing being correct (99.xx%), this might imply there are known documented failures of this testing. Does anyone know if this is the case, and if so, what are a few specific examples, for say Miller-Rabin? Or, is the percentage purely theoretically based, and there are no known, documented false primes reported in published work?--Billymac00 01:35, 17 June 2007 (UTC)[reply]

Yes, many tests can say "probable prime" about a composite and have known examples of that. See Category:Pseudoprimes and Miller-Rabin primality test. See also http://mathworld.wolfram.com/Rabin-MillerStrongPseudoprimeTest.html for some composites which are not detected by a few Miller-Rabin tests with small bases. PrimeHunter 01:52, 17 June 2007 (UTC)[reply]
If you were to ask, has a failure ever occurred in practice that significantly impacted the world, such as damaging a cryptographic scheme, the answer is almost certainly no. They set the error extremely tiny for any case where it matters, and I'm not aware of any documented failure to this effect. The fact that such failures are possible is proven mathematically. Dcoetzee 08:08, 17 June 2007 (UTC)[reply]
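A concrete documented failure of the kind PrimeHunter mentions: 2047 = 23 · 89 is the smallest strong pseudoprime to base 2, so a single fixed-base-2 Miller-Rabin round calls it a probable prime. A sketch (the function name is mine):

```python
def strong_probable_prime(n, a):
    """One Miller-Rabin round with a FIXED base a (odd n > 2)."""
    d, s = n - 1, 0
    while d % 2 == 0:            # n - 1 = d * 2^s with d odd
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x in (1, n - 1):
        return True              # n passes: probable prime to base a
    for _ in range(s - 1):
        x = pow(x, 2, n)
        if x == n - 1:
            return True
    return False                 # a witnesses that n is composite

# 2047 = 23 * 89 passes base 2 (since 2^11 = 1 mod 2047) but fails base 3.
```

This is exactly why the randomized test uses many random bases rather than one fixed base.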

Naïve Methods Generalization

"Generalising further, it can be seen that all primes are of the form c#k + i for i < c# where i represents the numbers that are coprime to c# and where c and k are integers. For example, let c = 6. Then c# = 2 · 3 · 5 = 30. All integers are of the form 30k + i for i = 0, 1, 2,...,29 and k an integer. However, 2 divides 0, 2, 4,...,28 and 3 divides 0, 3, 6,...,27 and 5 divides 0, 5, 10,...,25. So all prime numbers are of the form 30k + i for i = 1, 7, 11, 13, 17, 19, 23, 29 (i.e. for i < 30 such that gcd(i,30) = 1). Note that if i and 30 are not coprime, then 30k + i is divisible by a prime divisor of 30, namely 2, 3 or 5, and is therefore not prime."

This part doesn't seem to be correct. Given the example, if c=6, and the rule is 30k + i, with i=1, 7, 11, 13, 17, 19, 23, 29, then we see that for k=1 and i=19, 30k+i=49 = 7x7, which is not a prime.

Prajo (talk) 17:49, 30 June 2009 (UTC)[reply]

You are misreading it. It says that all primes are of the given form, it does not say that all integers of that form are primes. That's the other implication. — Emil J. 17:57, 30 June 2009 (UTC)[reply]
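The one-way implication Emil J. points out can be checked mechanically. A sketch (the function name is mine) of the mod-30 wheel and Prajo's own 49 as the counterexample to the converse:

```python
from math import gcd

def wheel_residues(modulus=30):
    """Residue classes i < modulus with gcd(i, modulus) = 1. Every prime
    not dividing the modulus falls in one of these classes."""
    return [i for i in range(modulus) if gcd(i, modulus) == 1]

# wheel_residues(30) -> [1, 7, 11, 13, 17, 19, 23, 29]
# The converse fails: 30*1 + 19 = 49 = 7*7 lies in a coprime class
# yet is composite, just as the reply explains.
```

So the wheel only prunes candidates; each surviving 30k + i must still be tested for primality.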

Introduction

"As of 2009[update], factorization is a computationally hard problem, whereas primality testing is comparatively easy (its running time is polynomial)." Factorization also has a polynomial running time. This sentence is misleading. I offer to delete or change "(its running time is polynomial)" part. —Preceding unsigned comment added by 188.3.128.114 (talk) 13:33, 10 July 2009 (UTC)[reply]

No, there is no known factorization algorithm which can produce the factorization in time polynomial in the size of the input, i.e., polynomial in log(n) where n is the number to be factored. See more at Integer factorization#Difficulty and complexity. Don't make this change. PrimeHunter (talk) 13:57, 10 July 2009 (UTC)[reply]

Really?

In the section Naive Methods "A good way to speed up these methods (and all the others mentioned below) is to pre-compute and store a list of all primes up to a certain bound, say all primes up to 200."

This is a small point, and I may be wrong about this, but I was under the impression that with a modern processor it took less time to compute this list during run time than to fetch it from a hard-disk. Unless you mean "pre-compute during the program and save into memory" instead of "read from a file"; but this sentence indicated the latter to me on first read. 128.120.218.130 (talk) —Preceding undated comment added 23:19, 16 July 2009 (UTC).[reply]

It depends what you mean by "from a file" - this may be programmed into the compiler and/or compiled into the program. A variety of common mathematical operations are often stored as lookup tables since they are repeated so many times in so many contexts. Moreover, even if prime number detection is polynomial, that doesn't beat constant-time detection. Jztinfinity (talk) 13:21, 2 August 2011 (UTC)[reply]
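Either reading of "pre-compute" amounts to the same thing in practice: a sieve up to 200 runs in microseconds at startup, so the list can live in memory with no file involved. A sketch (function and constant names are mine):

```python
def small_primes(bound=200):
    """Sieve of Eratosthenes up to `bound`; cheap enough to run once at
    program start, so 'pre-compute and store' need not mean a disk file."""
    sieve = [True] * (bound + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(bound ** 0.5) + 1):
        if sieve[p]:
            # mark every multiple of p from p*p upward as composite
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [p for p in range(bound + 1) if sieve[p]]

SMALL_PRIMES = small_primes()   # computed once, then reused for screening
```

Screening a candidate against `SMALL_PRIMES` before a full test discards most composites almost for free.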

History section?

I would find a history section of this article interesting and helpful. When learning something new, I find it helpful to study historical beginnings along with logical beginnings. I find this also lends insight into the "naive" ways of thinking and generally makes me feel less stupid starting out.

I don't really know anything about the history of primality testing, but I guess the written record begins with Euclid in Book IX of the Elements. If a history section would be helpful to anyone else, then I'm all for working with people on it. —Preceding unsigned comment added by 67.164.32.152 (talk) 13:06, 23 April 2011 (UTC)[reply]

Trial division

I think the primality test should be anything from sqrt(n-1) to (1/2)*n. Anything else can be discarded. --Mnoon (talk) 20:59, 28 September 2011 (UTC)[reply]

I guess you refer to trial division. sqrt(n-1) to (1/2)*n would be possible but extremely inefficient. 2 to sqrt(n-1) is far faster because there are far fewer numbers to test. For example, if n = 10^10 then sqrt(n-1) to (1/2)*n is 100,000 to 5,000,000,000 instead of 2 to 100,000. If m is an integer from sqrt(n-1) to (1/2)*n and m divides n, then n/m is an integer from 2 to sqrt(n) and n/m divides n, so the same factors can be found by testing the two intervals. PrimeHunter (talk) 22:50, 28 September 2011 (UTC)[reply]
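PrimeHunter's pairing argument can be demonstrated on a small number. A sketch (the function name is mine) splitting the divisors of n into the two intervals and checking that every "high" divisor is n divided by a "low" one:

```python
import math

def factors_low_high(n):
    """Divisors of n in [2, sqrt(n)] vs. (sqrt(n), n/2]; each high
    divisor m pairs with n // m in the low interval, so scanning only
    2..isqrt(n) already finds every factor pair."""
    low = [m for m in range(2, math.isqrt(n) + 1) if n % m == 0]
    high = [m for m in range(math.isqrt(n) + 1, n // 2 + 1) if n % m == 0]
    return low, high
```

For n = 100 the low interval holds 2, 4, 5, 10 and the high interval holds 20, 25, 50, each of which is 100 divided by a low divisor.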

The Fastest Deterministic Primality Test

How do I introduce the fastest deterministic primality test method? I created a 1970s-style BASIC program which outlines the method in a mere 70 lines; it utilizes the method devised by John L. Selfridge without the need for a decision tree. It is by far the quickest and simplest test out there...

100 CLS : C = 1
102 DIM A(10001), LT(1), T(10001)
104 FOR N = 3 TO 7919 STEP 2
105   IF N MOD 3 <> 0 AND N MOD 5 <> 0 AND N <> 7 THEN
106     REM *** Compute the numbered terms of the L-Sequence ***
108     A(0) = N: A(1) = (N + 1) / 2: A(2) = A(1) - 1
110     FOR I = 3 TO 10001 STEP 2
112       IF A(I - 1) < 3 THEN
114         IF A(I - 1) = 2 THEN
116           A(I) = 1: A(I + 1) = 0
118           M = I + 1: I = 10001
120         ELSE
122           A(I) = 0: M = I: I = 10001
124         END IF
126       ELSE
128         IF A(I - 1) MOD 2 = 1 THEN
130           A(I) = A(I - 2) / 2
132         ELSE
134           A(I) = (A(I - 2) + 1) / 2
136         END IF
138         A(I + 1) = A(I) - 1
140       END IF
142     NEXT I
143     REM *** I think there's more parity change near, on, or after ((N-1)/4) ***
144     D = 0: L = INT((N - 1) / 4): LL = L: LT(0) = 2: LT(1) = 2
146     WHILE (L < N - 2)
148       T(M) = 2: T(M - 1) = L: T(M - 2) = (T(M - 1) ^ 2 - 2) MOD N
150       IF T(M - 2) < 2 THEN T(M - 2) = T(M - 2) + N
152       IF T(M - 2) <> 2 THEN
154         K = 0: Z1 = 0: Z2 = 0
155         REM *** Compute the values of the terms in the L-Sequence ***
156         FOR J = M - 3 TO 0 STEP -1
158           IF A(J) MOD 2 = 1 THEN
160             IF A(J) = A(J + 1) + A(J + 2) THEN K = 0 ELSE K = 1
162             T(J) = (T(J + 1 + K) * T(J + 2 + K) - L) MOD N
164           ELSE
166             T(J) = (T(J + 2) ^ 2 - 2) MOD N
168           END IF
170           IF T(J) < 2 THEN T(J) = T(J) + N
172         NEXT J
174         Z1 = (T(2) ^ 2 - 2) MOD N
176         IF Z1 < 2 THEN Z1 = Z1 + N
178         Z2 = (T(1) ^ 2 - 2) MOD N
180         IF Z2 < 2 THEN Z2 = Z2 + N
181         REM *** Evaluate whether N is prime or composite ***
182         IF T(0) = L THEN
184           IF Z1 = 2 AND Z2 = T(M - 2) THEN
186             LT(D) = -1
188           ELSE
190             IF Z1 = T(M - 2) AND Z2 = 2 THEN
192               LT(D) = 1
194             ELSE
195               IF LMAX < L - LL THEN LMAX = L - LL
196               PRINT N; "is composite!": L = N - 2
198             END IF
200           END IF
202         ELSE
203           IF LMAX < L - LL THEN LMAX = L - LL
204           PRINT N; "is composite!": L = N - 2
206         END IF
208         IF LT(0) = -LT(1) THEN
209           IF LMAX < L - LL THEN LMAX = L - LL
210           PRINT N; L; "is prime!": L = N - 2: C = C + 1
212         END IF
214         D = 1
216       END IF
218       L = L + 1
220     WEND
222   ELSE
224     IF N = 3 OR N = 5 OR N = 7 THEN
226       PRINT N; "is prime!": C = C + 1
228     ELSE
230       PRINT N; "is composite!"
232     END IF
234   END IF
236 NEXT N
237 REM *** LMAX + 1 is the maximum number of trials in the range ***
238 PRINT C; LMAX + 1
240 END

Enjoy! An eighth-grader could verify this result, and I'm not violating any copyright. 99.142.30.54 (talk) 03:25, 29 August 2012 (UTC)[reply]