Message202779
Running the following task using concurrent.futures.ThreadPoolExecutor works with max_workers == 1 and fails when max_workers > 1:
import os
import shutil
import subprocess
import tempfile

def task():
    dirname = tempfile.mkdtemp()
    f_w = open(os.path.join(dirname, "stdout.txt"), "w")
    f_r = open(os.path.join(dirname, "stdout.txt"), "r")
    e_w = open(os.path.join(dirname, "stderr.txt"), "w")
    e_r = open(os.path.join(dirname, "stderr.txt"), "r")
    with subprocess.Popen("dir", shell=True, stdout=f_w, stderr=e_w) as p:
        pass
    f_w.close()
    f_r.close()
    e_w.close()
    e_r.close()
    shutil.rmtree(dirname)
When max_workers > 1, shutil.rmtree raises: "The process cannot access the file because it is being used by another process".
See also this Stack Overflow question about what seems to be a similar problem: http://stackoverflow.com/questions/15966418/python-popen-on-windows-with-multithreading-cant-delete-stdout-stderr-logs The discussion on SO indicates that this might be an XP-only problem.
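For context (an observation added here, not part of the original report): this failure mode is a handle-inheritance race on Windows, where a child spawned by one thread inherits the still-open log-file handles of another thread, keeping the files alive past close(). Python 3.4's PEP 446 made file descriptors non-inheritable by default, which addresses exactly this class of leak. The sketch below exercises the same pattern with context managers and a portable command (sys.executable instead of "dir"); the get_inheritable check and the task structure are illustrative assumptions, not code from the report:

```python
import os
import shutil
import subprocess
import sys
import tempfile
from concurrent.futures import ThreadPoolExecutor

def task():
    dirname = tempfile.mkdtemp()
    out_path = os.path.join(dirname, "stdout.txt")
    err_path = os.path.join(dirname, "stderr.txt")
    with open(out_path, "w") as f_w, open(err_path, "w") as e_w:
        # On Python 3.4+ these descriptors are non-inheritable (PEP 446),
        # so a child spawned concurrently by another thread cannot hold
        # them open after this thread closes them.
        assert not os.get_inheritable(f_w.fileno())
        with subprocess.Popen([sys.executable, "-c", "print('hi')"],
                              stdout=f_w, stderr=e_w) as p:
            p.wait()
    # With non-inheritable handles this succeeds even with max_workers > 1.
    shutil.rmtree(dirname)
    return not os.path.exists(dirname)

with ThreadPoolExecutor(max_workers=4) as ex:
    results = list(ex.map(lambda _: task(), range(8)))
print(all(results))
```

On the reporter's Python/XP combination the original reproducer fails; under PEP 446 semantics the same concurrent pattern cleans up its temporary directories without the sharing violation.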
The attached file reproduces the problem on my Windows XP VM.
History:
Date                 User                 Action  Args
2013-11-13 21:02:53  Bernt.Røskar.Brenna  set     recipients: + Bernt.Røskar.Brenna
2013-11-13 21:02:53  Bernt.Røskar.Brenna  set     messageid: <1384376573.19.0.824450606517.issue19575@psf.upfronthosting.co.za>
2013-11-13 21:02:53  Bernt.Røskar.Brenna  link    issue19575 messages
2013-11-13 21:02:52  Bernt.Røskar.Brenna  create