
Author lowell87
Recipients lowell87, mramahi77
Date 2008-12-30.22:54:46
Message-id <1230677688.04.0.0203819261483.issue4749@psf.upfronthosting.co.za>
Content
I've run into the same problem before.  Because of differences in how
Unix and Windows handle open files (inodes vs. file handles), the
problem is more apparent on Windows, but it isn't handled 100%
correctly on Unix systems either.

I think the root of the problem is that nothing in the code handles
multiple concurrent processes writing to a single log file, and I'm not
sure there is a simple fix for this.  I tried several easy workarounds,
such as retrying failed file renames and re-opening closed files, but I
ultimately discovered that all such approaches are inadequate and can
actually result in losing old log files too (in the worst-case
scenario).
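
For what it's worth, the race is easy to reproduce with the standard
handler; a minimal sketch (the file name and size limits here are made
up for illustration) is just a RotatingFileHandler opened by several
processes at once:

    # Run concurrently in each of several worker processes.
    import logging
    import logging.handlers

    handler = logging.handlers.RotatingFileHandler(
        "app.log", maxBytes=1024, backupCount=3)
    logger = logging.getLogger("worker")
    logger.addHandler(handler)
    for i in range(10000):
        logger.error("message %d", i)

Once the file passes maxBytes, every process independently calls
doRollover().  On Windows the os.rename() fails outright because the
other processes still have the file open; on Unix the rename succeeds,
but the other processes keep appending to the renamed inode, so their
records end up in the wrong backup file and can be clobbered by a later
rollover.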

I finally got frustrated enough to just take the time to write my own
handler.  It is based on the built-in one and aims to be a "drop-in"
replacement.  It uses file locking to safely write to a single log file
from multiple Python processes concurrently.  If you would like to give
it a try, it is located here:
   http://pypi.python.org/pypi/ConcurrentLogHandler
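
The core idea is simple enough if you'd rather roll your own: take an
exclusive lock around every write.  A stripped-down sketch for Unix
using fcntl.flock() (this is NOT the actual ConcurrentLogHandler code,
the class name is mine, and it skips rotation, which also has to
re-check the file under the lock):

    import fcntl
    import logging

    class LockingFileHandler(logging.FileHandler):
        """Hold an exclusive advisory lock across each record so that
        several processes appending to one file don't interleave
        writes."""

        def emit(self, record):
            fcntl.flock(self.stream.fileno(), fcntl.LOCK_EX)
            try:
                logging.FileHandler.emit(self, record)
            finally:
                fcntl.flock(self.stream.fileno(), fcntl.LOCK_UN)

On Windows you would have to substitute msvcrt.locking() or the pywin32
LockFileEx wrappers, since fcntl is Unix-only.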

I agree that it would be nice for the built-in logging handlers to do
this for you, but in the meantime this may be an option for you.