Message409535
Maybe it's Linux-specific? I managed to run pyperformance and got this:
### python_startup ###
Mean +- std dev: 23.2 ms +- 0.8 ms -> 23.4 ms +- 1.2 ms: 1.01x slower
Not significant
Note: I am not dismissing the report -- in fact it looks quite bad. But I am failing to reproduce it, which makes it harder to understand the root cause. :-(
Maybe we can create a microbenchmark for this that just parses a large amount of code?
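Something along these lines could work as a starting point. This is only a sketch of the idea, not an existing pyperformance benchmark: it mechanically generates a large module and times compile(), which exercises the PEG parser (plus the bytecode compiler) end to end. The function count and repetition count are arbitrary.

```python
import timeit

# Generate a large, trivial module: 2000 small function definitions.
src = "\n".join(
    f"def f{i}(x):\n    return x + {i}\n" for i in range(2000)
)

def parse_once():
    # compile() runs the parser and compiler on the whole source string.
    compile(src, "<bench>", "exec")

# Time 10 full compiles of the generated source.
t = timeit.timeit(parse_once, number=10)
print(f"10 compiles of {len(src)} chars took {t:.3f} s")
```

Comparing this number across the two commits in question would isolate parsing from interpreter startup, which python_startup mostly measures.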
Anyway, here's a random thought about why this might have such a big impact. Look at this snippet (found all over the parser.c file):
    if (p->level++ == MAXSTACK) {
        p->error_indicator = 1;
        PyErr_NoMemory();
    }
    if (p->error_indicator) {
        p->level--;
        return NULL;
    }
This is two "unlikely" branches in a row, and the first sets the variable tested by the second. Maybe this causes the processor to stall?
Also, maybe it would be wiser to use ++X instead of X++? (Though a good compiler would just change X++ == Y into ++X == Y+1.)
Anyway, without a way to reproduce, there's not much that can be done.
History:
Date | User | Action | Args
2022-01-02 22:49:21 | gvanrossum | set | recipients: + gvanrossum, terry.reedy, eric.smith, lukasz.langa, eric.snow, serhiy.storchaka, Kojoley, lys.nikolaou, pablogsal, xtreak, Dennis Sweeney, charles.mcmarrow.4
2022-01-02 22:49:21 | gvanrossum | set | messageid: <1641163761.13.0.458824030059.issue46110@roundup.psfhosted.org>
2022-01-02 22:49:21 | gvanrossum | link | issue46110 messages
2022-01-02 22:49:20 | gvanrossum | create |