This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

classification
Title: Parallel compilation fails because of low ulimit.
Type:
Stage: resolved
Components: Build
Versions: Python 3.8

process
Status: closed
Resolution: not a bug
Dependencies:
Superseder:
Assigned To:
Nosy List: kulikjak, pitrou
Priority: normal
Keywords:

Created on 2019-06-11 13:10 by kulikjak, last changed 2022-04-11 14:59 by admin. This issue is now closed.

Messages (6)
msg345232 - (view) Author: Jakub Kulik (kulikjak) * Date: 2019-06-11 13:10
When building and installing Python 3.8 on our SPARC machine, the build breaks during the compileall stage with [Error 24] Too many open files. The problem is caused by the recently enabled parallel compilation (issue36786).

When -j0 is passed to compileall, the executor starts as many workers as there are CPUs, which is a problem on our SPARC machine with 512 CPUs and a rather low ulimit on the number of open files. Compilation fails and cannot recover from that. For the same reason, test_compileall also breaks and never finishes.
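The effect described above can be sketched as follows. This is not compileall's exact code, just an illustration of the documented behaviour: a workers value of 0 is treated as "use all CPUs", since passing max_workers=None to ProcessPoolExecutor makes it default to os.cpu_count().

```python
import os

# compileall's -j0 means "use all CPUs": a workers value of 0 becomes
# max_workers=None, and ProcessPoolExecutor(max_workers=None) defaults
# to os.cpu_count() worker processes.
workers = 0
max_workers = workers or None  # 0 -> None
effective = os.cpu_count() if max_workers is None else max_workers

# On a 512-CPU machine this asks for 512 workers, each needing file
# descriptors for pipes to the parent, so a 256-descriptor ulimit is
# exhausted before the pool is even fully started.
print(effective)
```

On the reporter's machine, `effective` would be 512, well past what a 256-descriptor limit can support.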

I was able to fix both of these by patching Makefile.pre.in and test_compileall.py to use -j32 instead of -j0, but that is not the best solution.

I think the best solution would be for compileall.py to check this limit and take it into account when creating the ProcessPoolExecutor (or, perhaps even better, to recover when the problem occurs).
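The proposed check could look roughly like the sketch below. The helper name and the per-worker descriptor estimate are hypothetical, not anything compileall actually implements; it only assumes the standard resource.getrlimit API on POSIX systems.

```python
import os
import resource

def capped_workers(requested: int = 0) -> int:
    """Pick a worker count that respects the open-file ulimit.

    Hypothetical helper (not part of compileall): assumes each worker
    costs roughly a few file descriptors for its pipes, and reserves
    some headroom for the parent process itself.
    """
    cpus = os.cpu_count() or 1
    wanted = requested or cpus  # 0 means "all CPUs", as with -j0
    soft, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    # Reserve ~32 descriptors for the parent; estimate ~4 per worker.
    budget = max(1, (soft - 32) // 4)
    return max(1, min(wanted, budget))

# With a 256-descriptor soft limit this caps the pool well below 512
# workers instead of letting ProcessPoolExecutor fail mid-start.
print(capped_workers(0))
```

The constants are guesses; the point is only that the worker count can be bounded by RLIMIT_NOFILE before the pool is created, rather than discovering the limit via EMFILE at runtime.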
msg345233 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2019-06-11 14:07
Can't you just raise the ulimit? What is the current ulimit for your user?
msg345235 - (view) Author: Jakub Kulik (kulikjak) * Date: 2019-06-11 15:18
We have a limit of 256 open files, which is not much, but I can raise it, and then the failure does not happen.
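For reference, the workaround discussed here is a standard shell step before the build; the specific value 1024 is just an example, bounded above by the hard limit.

```shell
# Show the soft limit on open file descriptors (256 on the machine above)
ulimit -Sn
# Show the hard limit; the soft limit can be raised up to this value
ulimit -Hn
# Raise the soft limit for this shell session before building, e.g.:
#   ulimit -n 1024 && make install
```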

Mainly, I wanted to report that this can now happen. But I guess not many people will face this problem.
msg345236 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2019-06-11 15:20
I guess you'd get a similar problem if you were compiling on all cores simultaneously?  256 simultaneous open files is very low.
msg345237 - (view) Author: Jakub Kulik (kulikjak) * Date: 2019-06-11 15:24
I am not sure what you are asking now: compileall with -j0 does compile on all cores simultaneously, right?
msg383501 - (view) Author: Jakub Kulik (kulikjak) * Date: 2020-12-21 09:39
I am closing this: it is true that a limit of 256 open files is pretty low, and no matter how robust you make the code, arbitrarily low ulimits will crash it anyway.
History
Date User Action Args
2022-04-11 14:59:16 admin set github: 81413
2020-12-21 09:39:23 kulikjak set status: open -> closed; resolution: not a bug; messages: + msg383501; stage: resolved
2019-06-11 15:24:16 kulikjak set messages: + msg345237
2019-06-11 15:20:23 pitrou set messages: + msg345236
2019-06-11 15:18:47 kulikjak set messages: + msg345235
2019-06-11 14:07:54 pitrou set messages: + msg345233
2019-06-11 14:06:47 ned.deily set nosy: + pitrou
2019-06-11 13:10:04 kulikjak create