This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author vstinner
Recipients jbfzs, r.david.murray, rhettinger, serhiy.storchaka, vstinner, Александр Карпинский
Date 2017-11-21.23:54:27
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1511308467.69.0.213398074469.issue27535@psf.upfronthosting.co.za>
In-reply-to
Content
Attached bench_ignore_warn.py: a microbenchmark (using the perf module) measuring the cost of emitting a warning that is then ignored.
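The attached script itself is not reproduced here. A simplified stand-in for what it measures, using the stdlib timeit module instead of the perf module (function names below are illustrative), might look like:

```python
import timeit
import warnings

def emit_ignored_warning():
    # The warning matches an "ignore" filter, so nothing is displayed,
    # but the filter lookup and registry check still cost time.
    warnings.warn("always the same message", UserWarning)

warnings.simplefilter("ignore")
# Average the cost of one ignored-warning emission over many calls.
per_call = timeit.timeit(emit_ignored_warning, number=100_000) / 100_000
print(f"{per_call * 1e9:.0f} ns per ignored warning")
```

Unlike this sketch, the perf module spawns multiple worker processes and reports the mean with a standard deviation, which is what the comparisons below rely on.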

I added two basic optimizations to my PR 4489. With these optimizations, the slowdown is +17% on this microbenchmark:

$ ./python -m perf compare_to ref.json patch.json 
Mean +- std dev: [ref] 903 ns +- 70 ns -> [patch] 1.06 us +- 0.06 us: 1.17x slower (+17%)

The slowdown was much larger without optimizations, +42%:

$ ./python -m perf compare_to ref.json ignore.json 
Mean +- std dev: [ref] 881 ns +- 59 ns -> [ignore] 1.25 us +- 0.08 us: 1.42x slower (+42%)
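For readers unfamiliar with perf's output, the "Nx slower (+P%)" figures are simply the ratio of the two mean timings, which can be checked by hand:

```python
# Reproduce perf's "Nx slower (+P%)" figures from the quoted means:
# 903 ns -> 1.06 us (patched) and 881 ns -> 1.25 us (unoptimized).
for label, ref_ns, new_ns in [("patch", 903, 1060), ("ignore", 881, 1250)]:
    ratio = new_ns / ref_ns
    print(f"[{label}] {ratio:.2f}x slower (+{(ratio - 1) * 100:.0f}%)")
# -> [patch] 1.17x slower (+17%)
# -> [ignore] 1.42x slower (+42%)
```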


About the memory vs. CPU tradeoff: we are talking about ~1000 ns. IMHO, 1000 ns is "cheap" (fast), and emitting warnings is a rare event in Python. I would rather make warnings slower than "leak" memory (the current behaviour: the warnings registry grows without limit).
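To illustrate the "leak": each distinct (message, category, lineno) triple adds one key to the emitting module's `__warningregistry__` dict, so code emitting warnings with varying messages grows that dict without bound. The sketch below demonstrates the growth mechanism using the "default" filter action, which still records one key per distinct message on current Python; before this fix, the "ignore" action populated the registry the same way:

```python
import warnings

def emit_many(n):
    # With the "default" action, each distinct (message, category, lineno)
    # key is added to the caller's module-level __warningregistry__ so the
    # same warning is only displayed once -- the dict grows with every key.
    for i in range(n):
        warnings.warn("message %d" % i, UserWarning)

with warnings.catch_warnings(record=True):
    warnings.simplefilter("default")  # override the "always" set by record=True
    emit_many(1000)

# One key per distinct message, plus a 'version' marker entry.
registry_size = len(globals()["__warningregistry__"])
print(registry_size)
```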
History
Date User Action Args
2017-11-21 23:54:27    vstinner    set       recipients: + vstinner, rhettinger, r.david.murray, serhiy.storchaka, Александр Карпинский, jbfzs
2017-11-21 23:54:27    vstinner    set       messageid: <1511308467.69.0.213398074469.issue27535@psf.upfronthosting.co.za>
2017-11-21 23:54:27    vstinner    link      issue27535 messages
2017-11-21 23:54:27    vstinner    create