How Many Primes Did We Miss?
The mathematics world is abuzz with the verification of a new largest known prime number. The number, 2^57,885,161 – 1, is a Mersenne Prime, and has over 17 million digits. The previous largest known prime was 2^43,112,609 – 1, which had a mere 12.9 million or so digits.
It’s interesting to note that, while it has been known for thousands of years that there are infinitely many primes, it is a challenge even today to find large ones.
It is also interesting to note how many primes were missed in jumping from the previous largest prime to this new largest prime.
A well-known and elegant result, Bertrand’s Postulate, states that there is always a prime number between n and 2n, for n > 1. For example, the prime 3 is between 2 and 4; the prime 5 is between 3 and 6; the prime 11 is between 10 and 20; and so on.
In particular, this says that there must be a prime between 2^43,112,609 and 2^43,112,610, since 2^43,112,610 = 2 × 2^43,112,609.
Thus, there must be a prime between 2^43,112,610 and 2^43,112,611, and another between 2^43,112,611 and 2^43,112,612, and so on!
Thus, there are at least 57,885,161 – 43,112,609 = 14,772,552 primes between 2^43,112,609 and 2^57,885,161! Since one of those primes might be the new record prime 2^57,885,161 – 1 itself, we can therefore safely say there are at least 14,772,551 primes between the largest and second-largest known primes!
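For readers who want to see this argument in action, here is a minimal Python sketch (just one way to check it) that verifies Bertrand’s Postulate by brute force for small n and then counts the doublings between the two exponents:

```python
# Brute-force check of Bertrand's Postulate for small n, plus the count of
# doublings between the two record exponents (each doubling contains a prime).

def is_prime(k):
    """Trial-division primality test -- fine for small k."""
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

# Bertrand's Postulate: for every n > 1 there is a prime strictly between n and 2n.
for n in range(2, 1000):
    assert any(is_prime(p) for p in range(n + 1, 2 * n)), n

# From 2^43,112,609 up to 2^57,885,161 there are 57885161 - 43112609 doublings,
# and each one contains at least one prime.
print(57885161 - 43112609)   # 14772552
```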
Let’s hope it’s not another 4 years until we have a new largest prime on the block.
4 Comments
Alan Eliasen · February 7, 2013 at 2:59 am
It’s certainly true that there are at least that many primes between those numbers, but the estimate of how many primes there actually are is so much larger as to be staggering!
The Prime Number Theorem tells us that the probability of a random positive integer “n” being prime is approximately 1/ln[n] (where ln is the natural log function). So the number of prime numbers smaller than n is approximated roughly by n/ln[n].
If you work these numbers out, this gives us an estimate that there are roughly:
2^57885161 / (57885161 ln[2])
prime numbers in that range! This is the absolutely staggeringly huge number
1.45 * 10^17,425,162
of prime numbers likely to be found between those numbers! This is staggeringly larger than the lower bound from Bertrand’s Postulate above.
Note that I didn’t even try to subtract out the number of primes below 2^43112609 because it’s vanishingly small in comparison. (It’s about 1.06 * 10^12,978,181 which means that its effect would only show up almost 5 million decimal places later.)
You can get a better estimate of primes using the “log integral” function, but this gives us a fair ballpark with very little work.
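For anyone who wants to reproduce these magnitudes, here is a minimal Python sketch that evaluates the n/ln[n] estimate entirely in base-10 logarithms, since the numbers themselves have millions of digits:

```python
# Prime Number Theorem estimate pi(n) ~ n / ln(n), evaluated in log space
# so we never have to construct numbers with millions of digits.
import math

def pnt_estimate(exponent):
    """Return (mantissa, power of 10) for 2^exponent / (exponent * ln 2)."""
    log10_value = exponent * math.log10(2) - math.log10(exponent * math.log(2))
    frac, whole = math.modf(log10_value)
    return 10 ** frac, int(whole)

for e in (57885161, 43112609):
    m, p = pnt_estimate(e)
    print(f"primes below 2^{e}: about {m:.2f} * 10^{p}")
# Roughly 1.45 * 10^17425162 and 1.06 * 10^12978181 -- the second is so far
# below the first that subtracting it doesn't change the leading digits.
```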
MrHonner · February 7, 2013 at 10:02 pm
Thanks for the thoughtful analysis, Alan! I had played around with the Prime Number Theorem myself, but I like your sensible estimates of these enormous numbers.
I love the simplicity and elegance of Bertrand’s postulate, even if it only succeeds in creating a [comically undersized!] lower bound. But, the PNT is only really an estimate; how much can we trust those numbers? 🙂
Alan Eliasen · February 8, 2013 at 4:55 am
Well, we know that as “n” goes to infinity, the bound I gave ( n / ln[n] ) approaches the actual count of prime numbers. (That is, their quotient goes to 1 as n goes to infinity.) So it’s a fairly good approximation, but we know there are better approximations if we want to work harder.
There are some better bounds here:
https://en.wikipedia.org/wiki/Prime_number_theorem
However, we’re in the ballpark with our quick estimate. That page (see the “Bounds on the Prime Counting Function” section) gives the bound (for n>55) that the actual count of prime numbers is greater than
n / (ln[n] + 2)
which gives
2^57885161 / (57885161 ln[2] + 2)
which works out to
1.450260856 * 10^17425162
And the number of primes is less than the upper bound:
n / (ln[n] – 4)
or
2^57885161 / (57885161 ln[2] – 4)
which gives
1.450261073 * 10^17425162
So, to line these up, we see that the bounds are:
1.450260856 * 10^17425162 (lower)
1.450261073 * 10^17425162 (upper)
Our upper and lower bounds are very close to each other indeed, which shows that our estimate is known to be accurate to at least 6 decimal places! Not bad for a simple estimate like this!
Note that I used the “weaker but sometimes useful bound” as mentioned in that page. It’s a fun exercise in working with these huge numbers to work out the even tighter bound above it. That’s “left as an exercise for the reader” but I might try again to estimate it.
Note that this estimate does not subtract out the number of primes below 2^43112609, because they’re again so negligible! We would have to work out almost 5 million decimal places of our estimate before its contribution even showed up!
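Here is a minimal Python sketch of these two bounds, again working with base-10 logarithms (the standard-library decimal module gives enough precision for a 9-digit mantissa):

```python
# The bounds n/(ln n + 2) < pi(n) < n/(ln n - 4) for n = 2^57885161,
# computed in log space since n itself has over 17 million digits.
from decimal import Decimal, getcontext

getcontext().prec = 50                # plenty of working digits

E = Decimal(57885161)                 # exponent of the record Mersenne prime
log10_n = E * Decimal(2).log10()      # log10 of n = 2^E
ln_n = E * Decimal(2).ln()            # ln of n

def mantissa_exponent(log10_value):
    """Split a base-10 logarithm into (mantissa, integer exponent)."""
    exponent = int(log10_value)       # floor, since the value is positive
    return Decimal(10) ** (log10_value - exponent), exponent

lower = log10_n - (ln_n + 2).log10()  # log10 of n / (ln n + 2)
upper = log10_n - (ln_n - 4).log10()  # log10 of n / (ln n - 4)

for name, value in (("lower", lower), ("upper", upper)):
    m, e = mantissa_exponent(value)
    print(f"{name} bound: {m:.9f} * 10^{e}")
# Expected: roughly 1.450260856 * 10^17425162 and 1.450261073 * 10^17425162,
# matching the numbers quoted above.
```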
Alan Eliasen · February 8, 2013 at 5:26 am
Okay, I worked out the stronger bound. Here’s tighter bounds on the number of primes, using the first equation given at https://en.wikipedia.org/wiki/Prime_number_theorem#Bounds_on_the_prime-counting_function:
1.450260964589077348 * 10^17425162 (lower)
1.450260964589079609 * 10^17425162 (upper)
So it appears that our estimate is good to more than 15 decimal places! The Prime Number Theorem estimate apparently gets very good when numbers get this large!
(Well, the *difference* between those numbers might be an incredibly huge number, but their ratio is very close to 1! 🙂 )
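And a similar sketch for the tighter estimate, assuming the Dusart-type inequalities quoted in that Wikipedia section: the count of primes up to x lies between (x/ln x)(1 + 1/ln x) and (x/ln x)(1 + 1/ln x + 2.51/(ln x)^2) for sufficiently large x.

```python
# Tighter Dusart-type bounds (as quoted on the Wikipedia page linked above):
#   (x/ln x)(1 + 1/ln x)  <  pi(x)  <  (x/ln x)(1 + 1/ln x + 2.51/(ln x)^2)
# evaluated in log space for x = 2^57885161.
from decimal import Decimal, getcontext

getcontext().prec = 60

E = Decimal(57885161)
log10_x = E * Decimal(2).log10()      # log10 of x
L = E * Decimal(2).ln()               # L = ln x

lower = log10_x - L.log10() + (1 + 1 / L).log10()
upper = log10_x - L.log10() + (1 + 1 / L + Decimal("2.51") / L ** 2).log10()

for name, value in (("lower", lower), ("upper", upper)):
    exponent = int(value)
    mantissa = Decimal(10) ** (value - exponent)
    print(f"{name} bound: {mantissa:.18f} * 10^{exponent}")
# The two mantissas agree to about 15 significant digits, consistent with
# the 1.450260964589077... and 1.450260964589079... figures above.
```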