
Author josh.r
Recipients JanKanis, benjamin.peterson, dlesco, docs@python, hynek, josh.r, lemburg, pitrou, stutzbach, terry.reedy
Date 2014-07-04.00:48:39
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1404434920.04.0.948257323929.issue21910@psf.upfronthosting.co.za>
In-reply-to
Content
+1. I've been assuming writelines handles arbitrary generators without issue; I guess I've just gotten lucky and only used implementations that do. I've fed it enormous (though not infinite) generators built from things like itertools.product, on the assumption that it would write them out safely without materializing len(seq) ** repeat values in memory.
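To make the scale concrete, here's a small sketch (the specific sequence and repeat count are illustrative, not from the original message) of why materializing such a generator would be ruinous while iterating it lazily is cheap:

```python
import itertools

# itertools.product(seq, repeat=r) yields len(seq) ** r tuples, but
# the generator itself holds only O(r) state in memory.
seq = range(10)
repeat = 9
gen = itertools.product(seq, repeat=repeat)  # lazy, near-zero memory

total = len(seq) ** repeat
print(total)  # 1000000000 tuples if something forces this into a list
```

Any writelines implementation that collects its argument into a list (or otherwise exhausts it up front) would try to hold all of those items at once.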

I'd definitely appreciate a documented guarantee of this. It doesn't need to explicitly guarantee that each item is written before the next one is pulled from the iterator; if it wants to buffer a reasonable amount of data in memory before triggering real I/O, that's fine (generators that return mutable objects and mutate them when the next object comes along are evil anyway, and forcing one-by-one output can prevent some useful optimizations). But anything that uses argument unpacking, collection into a list, ''.join (or, at the C level, PySequence_Fast and the like) to exhaust the whole generator before writing byte one is a bad idea.
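The lazy behavior being asked about can be observed directly. This sketch (the RecordingWriter class and event log are my own illustration, not part of the issue) uses io.IOBase.writelines, which in current CPython loops over its argument calling write() per item; whether that may be relied upon is exactly what the requested documentation guarantee would settle:

```python
import io
import itertools

events = []  # records the interleaving of generator yields and writes

def lines():
    # A finite product generator standing in for the enormous ones
    # described above.
    for a, b in itertools.product("ab", "12"):
        events.append(("yield", a + b))
        yield a + b + "\n"

class RecordingWriter(io.IOBase):
    """Minimal writable stream that logs each write() call."""
    def writable(self):
        return True
    def write(self, s):
        events.append(("write", s.strip()))
        return len(s)

RecordingWriter().writelines(lines())

# With lazy consumption, yields and writes interleave one-for-one
# rather than all yields happening before the first write.
print(events[:4])
```

If writelines instead exhausted the generator first (e.g. via PySequence_Fast), the log would show all four yields before any write, and an unbounded generator would never reach the write at all.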
History
Date                 User    Action  Args
2014-07-04 00:48:40  josh.r  set     recipients: + josh.r, lemburg, terry.reedy, pitrou, benjamin.peterson, stutzbach, dlesco, docs@python, JanKanis, hynek
2014-07-04 00:48:40  josh.r  set     messageid: <1404434920.04.0.948257323929.issue21910@psf.upfronthosting.co.za>
2014-07-04 00:48:40  josh.r  link    issue21910 messages
2014-07-04 00:48:39  josh.r  create