Message 385572
I did some profiling (attached a few files here with svgs) of running this script:
```python
import io
import tokenize

# picked as the second longest file in cpython
with open('Lib/test/test_socket.py', 'rb') as f:
    bio = io.BytesIO(f.read())

def main():
    for _ in range(10):
        bio.seek(0)
        for _ in tokenize.tokenize(bio.readline):
            pass

if __name__ == '__main__':
    exit(main())
```
The first profile is from before the optimization; the second is from after. The optimization takes the execution from ~6300 ms to ~4500 ms on my machine (a 28%-39% improvement, depending on how you calculate it).
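The two percentages correspond to two different baselines: the time saved as a fraction of the old time versus the new time. A quick check using the rounded timings quoted above (the quoted 39% presumably comes from unrounded measurements):

```python
before_ms, after_ms = 6300, 4500  # rounded timings from the message
saved = before_ms - after_ms      # 1800 ms saved

# Improvement as a fraction of the original (pre-optimization) time:
reduction = saved / before_ms     # 1800 / 6300 ~ 28.6%
# Speedup measured against the new (post-optimization) time:
speedup = saved / after_ms        # 1800 / 4500 = 40.0%

print(f"reduction: {reduction:.1%}, speedup: {speedup:.1%}")
# prints: reduction: 28.6%, speedup: 40.0%
```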
(I'll attach the pstats and svgs after creation; it seems I can only attach one file at a time.)
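For anyone wanting to reproduce this, here is a minimal sketch of how such a profile can be collected with the stdlib `cProfile` and `pstats` modules. The exact tooling used for the attachments is an assumption (the svgs are typically rendered from a `.pstats` file with an external tool such as gprof2dot), and the inline source below is a small stand-in for `Lib/test/test_socket.py`:

```python
import cProfile
import io
import pstats
import tokenize

# Small inline stand-in for the real input file used in the message
SOURCE = b"x = 1\nfor i in range(3):\n    x += i\n"
bio = io.BytesIO(SOURCE)

def main():
    for _ in range(10):
        bio.seek(0)
        for _ in tokenize.tokenize(bio.readline):
            pass

profiler = cProfile.Profile()
profiler.enable()
main()
profiler.disable()

# Same .pstats format as the attached profiles
profiler.dump_stats('tokenize.pstats')
stats = pstats.Stats('tokenize.pstats')
stats.sort_stats('cumulative').print_stats(5)
```

The `.pstats` file written here can then be fed to a visualizer to produce the svg call graphs.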
| Date | User | Action | Args |
| --- | --- | --- | --- |
| 2021-01-24 08:34:14 | Anthony Sottile | set | recipients: + Anthony Sottile |
| 2021-01-24 08:34:14 | Anthony Sottile | set | messageid: <1611477254.42.0.0381684920705.issue43014@roundup.psfhosted.org> |
| 2021-01-24 08:34:14 | Anthony Sottile | link | issue43014 messages |
| 2021-01-24 08:34:14 | Anthony Sottile | create | |