Message 398436
A patch would not land in Python 3.9, since this would be a new feature and out of scope for a released version.
Do you really want to store gigabytes of downloads in RAM instead of doing chunked reads and storing them on disk? If you cannot or don't want to write to disk, then there are easier and better ways to deal with large buffers. You could either mmap() or use a memoryview of a bytearray buffer:
buf = bytearray(4 * 1024**3)   # preallocate a 4 GiB buffer
view = memoryview(buf)         # zero-copy view into the buffer
pos = 0
while True:
    # recv_into() writes directly into the slice, up to 1 MiB at a time
    read = conn.recv_into(view[pos:pos + 1048576])
    if not read:               # peer closed the connection
        break
    pos += read
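The mmap() alternative mentioned above can be sketched the same way: an anonymous mapping gives a large, page-backed buffer that recv_into() can fill directly, without holding a giant bytes object. The function name `read_into_mmap` and its parameters are illustrative, not a stdlib API; `conn` stands for any connected socket.

```python
import mmap
import socket

def read_into_mmap(conn, size, chunk=1 << 20):
    """Receive up to `size` bytes from `conn` into an anonymous mmap.

    Illustrative sketch; not a stdlib API.
    """
    buf = mmap.mmap(-1, size)          # anonymous, page-backed buffer
    view = memoryview(buf)             # zero-copy view for recv_into()
    pos = 0
    while pos < size:
        n = conn.recv_into(view[pos:pos + chunk])
        if not n:                      # peer closed the connection
            break
        pos += n
    view.release()
    return buf, pos

# Usage with a local socket pair standing in for a real connection:
a, b = socket.socketpair()
b.sendall(b"x" * 50_000)
b.shutdown(socket.SHUT_WR)
data, received = read_into_mmap(a, 1 << 20)
```

The mapping is not backed by a file, so the data still lives in memory, but the kernel can lazily allocate and swap the pages, which behaves better than one huge bytes object for multi-gigabyte transfers.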
Date: 2021-07-28 22:34:25
Author: christian.heimes
Issue: issue42853