
Author Bernt.Røskar.Brenna
Recipients Bernt.Røskar.Brenna
Date 2013-11-13.21:02:52
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1384376573.19.0.824450606517.issue19575@psf.upfronthosting.co.za>
In-reply-to
Content
Running the following task using concurrent.futures.ThreadPoolExecutor works with max_workers == 1 and fails when max_workers > 1:

import os
import shutil
import subprocess
import tempfile

def task():
    dirname = tempfile.mkdtemp()
    # Open each log file twice: a writer handle for the child process
    # and a reader handle for reading the output back.
    f_w = open(os.path.join(dirname, "stdout.txt"), "w")
    f_r = open(os.path.join(dirname, "stdout.txt"), "r")
    e_w = open(os.path.join(dirname, "stderr.txt"), "w")
    e_r = open(os.path.join(dirname, "stderr.txt"), "r")

    # The with block waits for the child process to exit.
    with subprocess.Popen("dir", shell=True, stdout=f_w, stderr=e_w) as p:
        pass

    f_w.close()
    f_r.close()
    e_w.close()
    e_r.close()

    shutil.rmtree(dirname)
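
The report does not include the driver code. A minimal sketch along these lines (hypothetical, not the attached reproducer; the max_workers value and the number of submissions are illustrative) exercises the failing case:

from concurrent.futures import ThreadPoolExecutor

# Submit the task repeatedly so several worker threads run it
# concurrently; result() re-raises any exception raised in task().
with ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(task) for _ in range(20)]
    for f in futures:
        f.result()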

When max_workers > 1, shutil.rmtree raises: "The process cannot access the file because it is being used by another process". (Presumably the file handles opened in one worker thread are inherited by a child process launched concurrently from another thread, so the files remain open after close() until that unrelated child exits.)

See also this Stack Overflow question about what seems to be a similar problem: http://stackoverflow.com/questions/15966418/python-popen-on-windows-with-multithreading-cant-delete-stdout-stderr-logs. The discussion there suggests this may be an XP-only problem.

The attached file reproduces the problem on my Windows XP VM.
History
Date                 User                 Action         Args
2013-11-13 21:02:53  Bernt.Røskar.Brenna  setrecipients  + Bernt.Røskar.Brenna
2013-11-13 21:02:53  Bernt.Røskar.Brenna  setmessageid   <1384376573.19.0.824450606517.issue19575@psf.upfronthosting.co.za>
2013-11-13 21:02:53  Bernt.Røskar.Brenna  link           issue19575 messages
2013-11-13 21:02:52  Bernt.Røskar.Brenna  create