To test whether a number is prime, why do we only have to check divisibility up to the square root of that number?
If a number n is not a prime, it can be factored into two factors a and b:

n = a * b

Now a and b can't both be greater than the square root of n, since then the product a * b would be greater than sqrt(n) * sqrt(n) = n. So in any factorization of n, at least one of the factors must be less than or equal to the square root of n, and if we can't find any factors less than or equal to the square root, n must be a prime.
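The argument above translates directly into a simple trial-division test. A minimal sketch in Python (the function name `is_prime` is my own):

```python
import math

def is_prime(n):
    """Trial division: check divisors only up to sqrt(n)."""
    if n < 2:
        return False
    # If n has any nontrivial factor, it has one <= sqrt(n).
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

print([x for x in range(2, 30) if is_prime(x)])
# [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Note that `math.isqrt(n) + 1` makes the range inclusive of the square root, which matters for perfect squares like 121.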
Because if one factor is greater than the square root of n, the other factor that multiplies with it to give n is necessarily less than the square root of n.
Given any number n, one way to find its factors is to take its square root p:

sqrt(n) = p

Of course, if we multiply p by itself, we get back n:

p * p = n

This can be rewritten as:

a * b = n

where p = a = b. If a increases, then b must decrease to keep a * b = n. Therefore, p is the upper limit.
Update: re-reading this answer today, it became clearer to me. The value p is not necessarily an integer, because if it were, n would not be a prime. So p can be a real number (i.e., with a fractional part). And instead of going through the whole range up to n, we now only need to go through the range up to p: every factor greater than p is the mirror of a factor smaller than p, so checking up to p covers all of them.
Suppose n is not a prime number (and is greater than 1). Then there are numbers a and b such that

n = ab (1 < a <= b < n)

Multiplying the relation a <= b by a and by b, we get:

a^2 <= ab
ab <= b^2

Therefore (note that n = ab):

a^2 <= n <= b^2

Hence (note that a and b are positive):

a <= sqrt(n) <= b

So if a number (greater than 1) is not prime and we test divisibility up to the square root of the number, we will find one of the factors.
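The inequality can be confirmed numerically: for every factor pair (a, b) of a composite n with a <= b, the square root sits between them. A quick sketch (the helper `factor_pairs` is my own):

```python
import math

def factor_pairs(n):
    """All pairs (a, b) with a * b == n and 1 < a <= b < n."""
    return [(a, n // a) for a in range(2, math.isqrt(n) + 1) if n % a == 0]

for n in (12, 100, 121, 210):
    for a, b in factor_pairs(n):
        # a <= sqrt(n) <= b holds for every factorization
        assert a <= math.sqrt(n) <= b
```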
Let n be non-prime. Then it has at least two integer factors greater than 1. Let f be the smallest such factor of n. Suppose f > sqrt(n). Then n/f is an integer less than or equal to sqrt(n), and thus smaller than f. Therefore, f cannot be n's smallest factor. Reductio ad absurdum: n's smallest factor must be less than or equal to sqrt(n).
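The same contradiction argument yields a handy routine: the smallest nontrivial factor of a composite n always lies at or below sqrt(n). A sketch (the name `smallest_factor` is my own):

```python
import math

def smallest_factor(n):
    """Smallest factor of n greater than 1; returns n itself when n is prime."""
    for f in range(2, math.isqrt(n) + 1):
        if n % f == 0:
            return f
    return n  # no factor <= sqrt(n), so n must be prime

print(smallest_factor(91))  # 7, since 91 = 7 * 13
print(smallest_factor(97))  # 97, a prime
```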
A more intuitive explanation would be:
The square root of 100 is 10. Let's say a x b = 100, for various pairs of a and b.
If a == b, then each is exactly the square root of 100, which is 10.
If one of them is less than 10, the other has to be greater. For example, 5 x 20 == 100. One is greater than 10, the other is less than 10.
Thinking about a x b, if one of them goes down, the other must get bigger to compensate, so the product stays at 100. They pivot around the square root.
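The pivoting is easy to see by listing the factor pairs of 100, a quick sketch:

```python
# Factor pairs of 100: as a grows toward 10, b shrinks toward 10.
pairs = [(a, 100 // a) for a in range(1, 11) if 100 % a == 0]
print(pairs)  # [(1, 100), (2, 50), (4, 25), (5, 20), (10, 10)]
```

Every pair has one member at or below 10 and one at or above 10; they meet at the square root.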
The square root of 101 is about 10.049875621. So if you're testing the number 101 for primality, you only need to try the integers up through 10, including 10. But 8, 9, and 10 are not themselves prime, so you only have to test up through 7, which is prime.
Because if there's a pair of factors with one of the numbers bigger than 10, the other of the pair has to be less than 10. If the smaller one doesn't exist, there is no matching larger factor of 101.
If you're testing 121, the square root is 11. You have to test the primes from 2 through 11 (inclusive) to see if any divides it evenly. 11 goes in 11 times, so 121 is not prime. If you had stopped at 10 and not tested 11, you would have missed 11.
You have to test every prime greater than 2 but less than or equal to the square root, assuming the number you are testing is odd.
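Putting that together: handle 2 separately, then try only odd candidates up to the square root. A sketch under the same assumptions (using all odd divisors rather than only primes is still correct; restricting to primes merely saves work):

```python
import math

def is_prime(n):
    """Check 2 explicitly, then only odd trial divisors up to sqrt(n)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return False
    return True
```

This roughly halves the number of trial divisions compared with testing every integer up to the square root.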