This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

Author pitrou
Recipients eric.araujo, ezio.melotti, nadeem.vawda, ncoghlan, pitrou, tarek, terry.reedy
Date 2011-10-11.13:37:04
SpamBayes Score 0.021617
Marked as misclassified No
Message-id <>
Actually, a big part of that is compiling some regexes in the tokenize module. Just relying on the re module's internal caching shaves off 20% of total startup time.


Before (regexes compiled at import time):

$ time ./python -S -c 'import tokenize'

real	0m0.034s
user	0m0.030s
sys	0m0.003s
$ time ./python -c ''

real	0m0.055s
user	0m0.050s
sys	0m0.005s


After (relying on the re module's internal caching):

$ time ./python -S -c 'import tokenize'

real	0m0.021s
user	0m0.019s
sys	0m0.001s
$ time ./python -c ''

real	0m0.044s
user	0m0.038s
sys	0m0.006s
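
The saving comes from deferring compilation until a pattern is first used. A minimal sketch of the two styles, assuming a module-level constant like tokenize's (the pattern here is illustrative, not one of tokenize's actual regexes):

```python
import re

# Illustrative operator pattern -- not taken from tokenize itself.
OPERATOR = r"\*\*=?|>>=?|<<=?"

# Eager style: compiled when the module is imported, so the cost is
# paid at startup even if the pattern is never matched against.
OPERATOR_RE = re.compile(OPERATOR)

def match_eager(text):
    return OPERATOR_RE.match(text)

# Lazy style: pass the pattern string straight to re.match(); the re
# module compiles it on first use and keeps the compiled object in its
# bounded internal cache, so repeated calls stay fast while the
# import-time compilation cost disappears.
def match_lazy(text):
    return re.match(OPERATOR, text)
```

Both calls return the same match objects; the lazy style only adds a dict lookup in re's cache per call, which is why it can trade a little steady-state overhead for the 20% startup win measured above.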
Date User Action Args
2011-10-11 13:37:05	pitrou	set	recipients: + pitrou, terry.reedy, ncoghlan, nadeem.vawda, tarek, ezio.melotti, eric.araujo
2011-10-11 13:37:05	pitrou	set	messageid: <>
2011-10-11 13:37:04	pitrou	link	issue13150 messages
2011-10-11 13:37:04	pitrou	create