
Author mouse07410
Recipients Ryan May, barry, davin, gregory.p.smith, josh.r, kapilt, lukasz.langa, miss-islington, mouse07410, ned.deily, pablogsal, pitrou, ronaldoussoren, tdsmith, vstinner
Date 2020-03-29.10:42:25
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1585478545.75.0.00410545402563.issue33725@roundup.psfhosted.org>
In-reply-to
Content
The fix applied for this problem actually broke multiprocessing on macOS. Changing the default start method from 'fork' to 'spawn' causes the program to crash in spawn.py with `FileNotFoundError: [Errno 2] No such file or directory`.

I've tested this on macOS Catalina 10.15.3 and 10.15.4, with Python-3.8.2 and Python-3.7.7.

With Python-3.7.7 everything works as expected.
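
For completeness, a minimal sketch of the workaround (the same `mp.set_start_method('fork')` call that is commented out in the program further down); the start method has to be set before any Queue or Process objects are created:
{{{
import multiprocessing as mp

def worker(q):
    print(q.get())

if __name__ == '__main__':
    # Revert to the pre-3.8 default start method on macOS.
    # Must be called once, before any Queue or Process is created.
    mp.set_start_method('fork')

    q = mp.Queue()
    q.put(42)
    p = mp.Process(target=worker, args=(q,))
    p.start()
    p.join()
}}}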

Here's the output:
{{{
$ python3.8 multi1.py 
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/spawn.py", line 126, in _main
    self = reduction.pickle.load(from_parent)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/multiprocessing/synchronize.py", line 110, in __setstate__
    self._semlock = _multiprocessing.SemLock._rebuild(*state)
FileNotFoundError: [Errno 2] No such file or directory
}}}

Here's the program:
{{{
#!/usr/bin/env python3
#
# Test "multiprocessing" package included with Python-3.6+
#
# Usage:
#    ./multi1.py [nElements [nProcesses [tSleep]]]
#
#        nElements  - total number of integers to put in the queue
#                     default: 100
#        nProcesses - total number of parallel processes/threads
#                     default: number of physical cores available
#        tSleep     - number of milliseconds for a worker to sleep
#                     after it retrieves an element from the queue
#                     default: 17
#
# Algorithm:
#   1. Create a queue and add nElements integers to it
#   2. Start nProcesses worker processes
#   3. Each worker repeatedly pulls an element from the queue and sleeps for tSleep milliseconds
#

import sys, queue, time
import multiprocessing as mp


def getElements(q, tSleep, idx):
    l = []  # list of pulled numbers
    while True:
        try:
            l.append(q.get(True, .001))
            time.sleep(tSleep)
        except queue.Empty:
            if q.empty():
                print(f'worker {idx} done, got {len(l)} numbers')
                return


if __name__ == '__main__':
    nElements = int(sys.argv[1]) if len(sys.argv) > 1 else 100
    nProcesses = int(sys.argv[2]) if len(sys.argv) > 2 else mp.cpu_count()
    tSleep = float(sys.argv[3]) if len(sys.argv) > 3 else 17

    # Uncomment the following line to make it work with Python-3.8+
    #mp.set_start_method('fork')

    # Fill the queue with integers 0 .. nElements-1
    q = mp.Queue()
    for k in range(nElements):
        q.put(k)

    # Start worker processes
    for m in range(nProcesses):
        p = mp.Process(target=getElements, args=(q, tSleep / 1000, m))
        p.start()
}}}
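
As an aside (an assumption on my part, not verified beyond the runs above): the traceback shows the child failing while unpickling the Queue's SemLock, so keeping the parent alive until the workers finish, e.g. by joining them, might also sidestep the crash under 'spawn'. A hypothetical variant of the startup loop:
{{{
    # Hypothetical variant: keep the parent process alive until the
    # workers have finished, instead of exiting right after start().
    workers = []
    for m in range(nProcesses):
        p = mp.Process(target=getElements, args=(q, tSleep / 1000, m))
        p.start()
        workers.append(p)
    for p in workers:
        p.join()
}}}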