Message364757
A pre-computed table of primes might be better. Of course, how long should the table be? There are infinitely many primes.
Consider
>>> 2**32
4294967296
This number is approximately 4 * (10**9). According to https://en.wikipedia.org/wiki/Prime_number_theorem, there are 50,847,534 primes less than 10**9. So, very roughly, there are 200,000,000 primes less than 2**32. (The exact count, pi(2**32), is 203,280,221, so the rough estimate holds up well.)
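As a quick cross-check (my own arithmetic, not from the linked page): the prime number theorem approximates the count of primes below x by x / ln x, and evaluating that at 2**32 lands in the same ballpark:

import math

x = 2**32
# x / ln(x) slightly underestimates the true count; it comes out a little
# under 2 * 10**8, consistent with the rough figure above.
print(x / math.log(x))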
Thus, storing a list of all these primes as 32-bit unsigned integers would occupy about
>>> 200_000_000 / (1024**3) * 4
0.7450580596923828
or, in other words, about three-quarters of a gigabyte on disk.
A binary search into this list, using as its starting point the expected location given by the prime number theorem, might very well require on average fewer than two block reads from the file that holds the prime table on disk. And anyone who needs to find primes of this size probably has a spare gigabyte or two.
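Here is a minimal sketch of that lookup, assuming the primes are stored on disk as consecutive little-endian unsigned 32-bit integers in ascending order. The file name "primes.u32", the window width, and the helper names are all my own assumptions, not an existing format.

import math
import struct

RECORD = 4  # bytes per entry: one unsigned 32-bit prime

def read_prime(f, index):
    """Read the prime stored at the given index in the table file."""
    f.seek(index * RECORD)
    return struct.unpack("<I", f.read(RECORD))[0]

def is_prime_in_table(f, n, table_len):
    """Return True if n appears in the on-disk prime table.

    Start from the index suggested by the prime number theorem
    (pi(n) is roughly n / ln n), widen to a window that brackets n,
    then finish with an ordinary binary search inside that window.
    """
    if n < 2:
        return False
    guess = int(n / math.log(n))      # PNT-based starting index
    slack = max(1000, guess // 10)    # crude window half-width
    lo = max(0, guess - slack)
    hi = min(table_len - 1, guess + slack)
    # Widen the window until it brackets n (rarely needed with this slack).
    while lo > 0 and read_prime(f, lo) > n:
        lo = max(0, lo - slack)
    while hi < table_len - 1 and read_prime(f, hi) < n:
        hi = min(table_len - 1, hi + slack)
    # Ordinary binary search over the bracketed window.
    while lo <= hi:
        mid = (lo + hi) // 2
        p = read_prime(f, mid)
        if p == n:
            return True
        if p < n:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

# Example use (assuming the table file exists):
#   with open("primes.u32", "rb") as f:
#       count = os.path.getsize("primes.u32") // RECORD
#       print(is_prime_in_table(f, 4294967291, count))

The window here is deliberately generous; a sharper starting estimate than n / ln n (for example the logarithmic integral) would shrink it enough that the couple of block reads mentioned above becomes plausible.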
I'm naturally inclined to this approach because my mathematical research involves spending gigahertz-days computing tables. I then use the tables to examine hypotheses. See https://arxiv.org/abs/1011.4269. That work involves subsets of the vertices of the 5-dimensional cube, and there are of course 2**32 such subsets.