Guido van Rossum <[email protected]> added the comment:
Maybe it's Linux-specific? I managed to run pyperformance and got this:
### python_startup ###
Mean +- std dev: 23.2 ms +- 0.8 ms -> 23.4 ms +- 1.2 ms: 1.01x slower
Not significant
Note, I am not dismissing the report -- in fact it looks quite bad. But I am
failing to reproduce it, which makes it harder to understand the root cause. :-(
Maybe we can create a microbenchmark for this that just parses a large amount
of code?
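To make that concrete, here is a rough sketch of such a microbenchmark, written against the C embedding API so it exercises the same code path that parser.c sits on. The repeated-function source, the iteration counts, and the timing loop are all made up for illustration; a pyperf script calling compile() on a large module would be the more conventional way to measure the same thing.

#include <Python.h>

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    Py_Initialize();

    /* Build a large module by repeating a small function definition,
       so the run is dominated by parsing rather than interpreter startup. */
    const char *chunk = "def f(a, b):\n    return a + b\n";
    const size_t repeat = 5000;
    size_t chunk_len = strlen(chunk);
    char *src = malloc(repeat * chunk_len + 1);
    if (src == NULL) {
        Py_FinalizeEx();
        return 1;
    }
    for (size_t i = 0; i < repeat; i++) {
        memcpy(src + i * chunk_len, chunk, chunk_len);
    }
    src[repeat * chunk_len] = '\0';

    /* Time repeated compilations of the same source. */
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < 50; i++) {
        PyObject *code = Py_CompileString(src, "<bench>", Py_file_input);
        if (code == NULL) {
            PyErr_Print();
            break;
        }
        Py_DECREF(code);
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("50 compiles of ~%zu KiB of source: %.3f s\n",
           repeat * chunk_len / 1024, secs);

    free(src);
    Py_FinalizeEx();
    return 0;
}

It should build with roughly: gcc bench_parse.c $(python3-config --cflags) $(python3-config --ldflags --embed) -o bench_parse (the --embed flag is needed on 3.8+). Note that Py_CompileString also runs the bytecode compiler, so it is not a pure parsing number, but a regression coming from parser.c should still show up.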
Anyway, here's a random thought about why this might have such a big impact.
Look at this snippet (found all over the parser.c file):
if (p->level++ == MAXSTACK) {
    p->error_indicator = 1;
    PyErr_NoMemory();
}
if (p->error_indicator) {
    p->level--;
    return NULL;
}
This is two "unlikely" branches in a row, and the first sets the variable
tested by the second. Maybe this causes the processor to stall?
Also, maybe it would be wiser to use ++X instead of X++? (Though a good
compiler would just change X++ == Y into ++X == Y+1.)
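To make the suggestion concrete, here is a sketch of what that could look like, reusing the fields and MAXSTACK from the snippet above; the > comparison (instead of the original ==) is my own defensive tweak, and none of this has been benchmarked:

/* Hypothetical variant: pre-increment and bail out of the overflow case
 * directly, so the unlikely branch that writes p->error_indicator no
 * longer feeds straight into the branch that reads it. */
if (++p->level > MAXSTACK) {
    p->level--;
    p->error_indicator = 1;
    PyErr_NoMemory();
    return NULL;
}
if (p->error_indicator) {
    p->level--;
    return NULL;
}

The net effect on p->level is the same as today (incremented on success, restored on the error paths); whether this actually helps the branch predictor would have to be measured.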
Anyway, without a way to reproduce, there's not much that can be done.
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue46110>
_______________________________________