This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author tim.peters
Recipients bquinlan, diogocp, paul.moore, sbt, steve.dower, terry.reedy, tim.golden, tim.peters, zach.ware
Date 2016-05-07.18:23:02
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1462645383.19.0.137793545849.issue26903@psf.upfronthosting.co.za>
In-reply-to
Content
Just noting that the `multiprocessing` module can be used instead.  In the example, add

    import multiprocessing as mp

and change

        with concurrent.futures.ProcessPoolExecutor() as executor:

to

        with mp.Pool() as executor:

That's all it takes.  On my 4-core Win10 box (8 logical cores), that continued to work fine even when passing 1024 to mp.Pool() (although it obviously burned time and RAM to create over a thousand processes).
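
Putting both edits together, a minimal self-contained sketch (the issue's original example isn't reproduced in this message, so a trivial squaring worker stands in as an assumption):

```python
import multiprocessing as mp

def square(x):
    # Trivial stand-in worker -- the real workload from the issue's
    # example is not shown in this message.
    return x * x

if __name__ == "__main__":
    # mp.Pool accepts large worker counts even on Windows, unlike
    # concurrent.futures.ProcessPoolExecutor, which is bound by the
    # WaitForMultipleObjects() handle limit discussed below.
    with mp.Pool() as executor:
        results = executor.map(square, range(5))
    print(results)  # [0, 1, 4, 9, 16]
```

`mp.Pool()` defaults to one worker per logical core; passing an explicit count (e.g. `mp.Pool(1024)`) also works, at the cost of the process-creation time and RAM noted above.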

Some quick Googling strongly suggests there's no reasonably general way to overcome the Windows-defined MAXIMUM_WAIT_OBJECTS=64 for implementations that call the Windows WaitForMultipleObjects().
History
Date User Action Args
2016-05-07 18:23:03  tim.peters  set  recipients: + tim.peters, terry.reedy, paul.moore, bquinlan, tim.golden, sbt, zach.ware, steve.dower, diogocp
2016-05-07 18:23:03  tim.peters  set  messageid: <1462645383.19.0.137793545849.issue26903@psf.upfronthosting.co.za>
2016-05-07 18:23:03  tim.peters  link  issue26903 messages
2016-05-07 18:23:02  tim.peters  create