This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author mgorny
Recipients Julian, aduncan, davin, itamarst, itamarst2, mgorny, pitrou, wim.glenn
Date 2022-02-11.16:13:53
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <>
After updating PyPy3 to use Python 3.9's stdlib, we hit very bad hangs because of this: even compiling a single file with "parallel" compileall could hang.  In the end, we had to revert the change in how Python 3.9 starts workers, because otherwise multiprocessing would have been impossible to use.

This is a very bad default, and what's even worse is that it often causes deadlocks that are hard to reproduce or debug.  Furthermore, since "fork" is the default, people unintentionally rely on its support for passing non-pickleable objects and end up writing non-portable code.  The code often becomes complex and hard to change before they discover the problem.

Before we managed to figure out how to work around the deadlocks in PyPy3, we were experimenting with switching the default to "spawn".  Unfortunately, we hit multiple projects that didn't work with this method, precisely because of pickling problems.  Furthermore, their authors were surprised to learn that their code wouldn't work on macOS (after all, many people perceive Python as a language for writing portable software).
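A minimal sketch of the trap described above (the names `double` and `is_picklable` are illustrative, not from any of the affected projects): "spawn" must pickle everything it sends to a worker, while "fork" lets children inherit unpicklable objects, so fork-only assumptions stay hidden until the code runs where "spawn" is the default.

```python
import multiprocessing as mp
import pickle

def double(x):
    # A module-level function pickles by qualified name, so it works
    # with every start method ("fork", "spawn", "forkserver").
    return x * 2

def is_picklable(obj):
    # "spawn" and "forkserver" must pickle everything sent to a worker;
    # "fork" lets children inherit unpicklable objects, which hides the
    # problem until the code runs on macOS or Windows.
    try:
        pickle.dumps(obj)
        return True
    except Exception:
        return False

if __name__ == "__main__":
    # Selecting the start method explicitly, instead of relying on the
    # platform default, makes fork-only assumptions fail fast everywhere.
    ctx = mp.get_context("spawn")
    with ctx.Pool(2) as pool:
        print(pool.map(double, [1, 2, 3]))  # [2, 4, 6]
    print(is_picklable(double))             # True
    print(is_picklable(lambda x: x * 2))    # False: breaks under "spawn"
```

Passing the lambda to `pool.map` under "spawn" fails for exactly this reason, while the same call may appear to work on Linux under "fork".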

Finally, back in 2018 I made one of my projects do its work in parallel using multiprocessing.  It gave its users a great speedup, but for some of them it caused deadlocks that I could neither reproduce nor debug.  In the end, I had to revert it.  Now that I've learned about this problem, I wonder whether it wasn't precisely because of the "fork" method.
Date User Action Args
2022-02-11 16:13:53 mgorny link issue40379 messages