
Author christian.heimes
Recipients amacd31, bmerry, christian.heimes, jakirkham, jan-xyz, lukasz.langa, matan1008, methane, ronaldoussoren
Date 2021-07-28.22:34:25
SpamBayes Score -1.0
Marked as misclassified Yes
Message-id <1627511665.29.0.153622769.issue42853@roundup.psfhosted.org>
In-reply-to
Content
A patch would not land in Python 3.9, since this would be a new feature and new features are out of scope for an already released version.

Do you really want to store gigabytes of downloads in RAM instead of doing chunked reads and storing them on disk? If you cannot, or don't want to, write to disk, there are easier and better ways to deal with large buffers: you could either use mmap() or a memoryview of a bytearray buffer (a sketch of the mmap() variant follows the example below):

    # Read up to 4 GiB directly into a pre-allocated buffer.
    # "conn" is assumed to be an already-connected socket.
    buf = bytearray(4 * 1024**3)
    view = memoryview(buf)
    pos = 0
    while True:
        # recv_into() writes into a writable 1 MiB window of the buffer,
        # so no intermediate bytes objects are created.
        read = conn.recv_into(view[pos:pos + 1048576])
        if not read:
            break
        pos += read
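
For reference, a minimal sketch of the mmap() alternative mentioned above could look like this; it assumes the same already-connected socket "conn" and uses an anonymous mapping (fileno -1) rather than a file-backed one:

    import mmap

    # Anonymous 4 GiB mapping; not backed by any file on disk.
    buf = mmap.mmap(-1, 4 * 1024**3)
    view = memoryview(buf)
    pos = 0
    while True:
        read = conn.recv_into(view[pos:pos + 1048576])
        if not read:
            break
        pos += read

The receive loop is identical to the bytearray version; only the allocation of the underlying buffer changes.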
History
Date                 User              Action  Args
2021-07-28 22:34:25  christian.heimes  set     recipients: + christian.heimes, ronaldoussoren, methane, lukasz.langa, bmerry, jakirkham, matan1008, amacd31, jan-xyz
2021-07-28 22:34:25  christian.heimes  set     messageid: <1627511665.29.0.153622769.issue42853@roundup.psfhosted.org>
2021-07-28 22:34:25  christian.heimes  link    issue42853 messages
2021-07-28 22:34:25  christian.heimes  create