Author Aaron.Sherman
Recipients Aaron.Sherman
Date 2011-02-24.23:22:42
SpamBayes Score 2.13962e-06
Marked as misclassified No
Message-id <>
I wrote some code a while back that used os.popen. I recently got a warning that popen is deprecated, so I ran a test with the newer subprocess module. In that test, subprocess.Popen appears to carry a 40% process-creation overhead penalty relative to os.popen, which really isn't small. The difference seems to come from some heavy mmap-related work happening in my build of Python; that may well be platform-specific, but the mmap/mremap/munmap calls made in my sample subprocess code are not made at all in the os.popen equivalent.
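For reference, a minimal benchmark along these lines reproduces the comparison (this is a sketch, not my exact harness: `echo hi` stands in for the real child command and assumes a POSIX environment, and the iteration count is arbitrary):

```python
import os
import subprocess
import timeit

def run_os_popen():
    # Legacy API: spawn a trivial command and read its output via os.popen
    with os.popen("echo hi") as f:
        f.read()

def run_subprocess():
    # Replacement API: same command via subprocess.Popen
    p = subprocess.Popen(["echo", "hi"], stdout=subprocess.PIPE)
    p.communicate()

if __name__ == "__main__":
    n = 200  # number of child processes to spawn per API
    t_old = timeit.timeit(run_os_popen, number=n)
    t_new = timeit.timeit(run_subprocess, number=n)
    print("os.popen:         %.3fs for %d runs" % (t_old, n))
    print("subprocess.Popen: %.3fs for %d runs" % (t_new, n))
    print("overhead: %+.1f%%" % (100.0 * (t_new - t_old) / t_old))
```

Since the per-call cost is dominated by fork/exec plus whatever setup the API does before it, timing many spawns of a trivial command isolates the creation overhead itself rather than the child's work.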

Now, before someone says, "process creation is trivial, so a 40% hit is acceptable because it's 40% of a trivial part of your execution time," keep in mind that many Python applications are heavily focused on process creation. In my case that means monitoring, but I can also imagine a substantial impact on Web services and other applications that spend almost all of their time creating child processes. For a trivial script, subprocess is fine as is; for these demanding applications, it represents a significant source of pain.

Anyway, my testing, results, and conclusions are written up here:
2011-02-24 23:22:42 Aaron.Sherman link issue11314 messages