This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author Sean Murphy
Recipients Sean Murphy
Date 2016-12-14.00:51:48
Python fails to pass a Queue to a child Process when the multiprocessing start method has been set to "spawn" or "forkserver" via multiprocessing.set_start_method; the child raises the following while unpickling:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/lib/python3.5/multiprocessing/", line 106, in spawn_main
    exitcode = _main(fd)
  File "/usr/lib/python3.5/multiprocessing/", line 116, in _main
    self = pickle.load(from_parent)
  File "/usr/lib/python3.5/multiprocessing/", line 111, in __setstate__
    self._semlock = _multiprocessing.SemLock._rebuild(*state)
FileNotFoundError: [Errno 2] No such file or directory

Here is a minimized example:
#!/usr/bin/env python3

import multiprocessing

def check_child(q):
    print("Queue", q)

if __name__ == '__main__':
    # Fails with "spawn" (and "forkserver"); works with the Linux default, "fork".
    multiprocessing.set_start_method('spawn')
    # multiprocessing.set_start_method('fork')
    # multiprocessing.set_start_method('forkserver')

    q = multiprocessing.Queue(-1)
    print("q", q)

    proc = multiprocessing.Process(target=check_child, args=(q,))
    proc.start()
    proc.join()

This also fails when the Queue is passed to the child implicitly, as an attribute of the bound method's instance:

class Blerg:
    def __init__(self):
        self.q = multiprocessing.Queue(-1)

    def print_queue(self):
        print("Queue", self.q)

if __name__ == '__main__':
    blerg = Blerg()

    proc = multiprocessing.Process(target=blerg.print_queue)
    proc.start()
    proc.join()
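Not part of the original report, but for comparison: a manager-backed queue is a proxy object rather than a SemLock-based Queue, so it pickles by reconnecting to the manager process and may serve as a workaround under "spawn" (a sketch based on that assumption):

```python
import multiprocessing

def check_child(q):
    # The child receives a proxy to the manager's queue, not a raw SemLock.
    q.put("hello from child")

if __name__ == '__main__':
    multiprocessing.set_start_method('spawn')
    with multiprocessing.Manager() as manager:
        q = manager.Queue()  # proxy object: picklable under "spawn"
        proc = multiprocessing.Process(target=check_child, args=(q,))
        proc.start()
        proc.join()
        print(q.get())  # prints "hello from child"
```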

$ python3 --version
Python 3.5.2

Windows (which defaults to "spawn"-style multiprocessing) does not seem to have this issue (at least in Python 2.7.12).
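For reference (my addition, not from the report), the effective start method can be checked at runtime with multiprocessing.get_start_method(), which returns the platform default when none was set explicitly:

```python
import multiprocessing

if __name__ == '__main__':
    # "fork" by default on Linux in Python 3.5; "spawn" on Windows.
    print(multiprocessing.get_start_method())
```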
Date	User	Action	Args
2016-12-14 00:51:48	Sean Murphy	set recipients: + Sean Murphy
2016-12-14 00:51:48	Sean Murphy	set messageid: <>
2016-12-14 00:51:48	Sean Murphy	link	issue28965 messages
2016-12-14 00:51:48	Sean Murphy	create