Message213027
I was more interested in the import time, and it has slightly increased. Memory consumption ((32**2+2*85**2)*sys.getsizeof(b'xx')/2**10 + sys.getsizeof(dict.fromkeys(range(32**2))) + 2*sys.getsizeof(dict.fromkeys(range(85**2))) + 2*(85**2-256)*sys.getsizeof(85**2) < 1MB) seemed small compared with the overall memory usage of Python (20-25 MB), so I ignored it.
There is no need for lazy initialization of small tables, and the overhead of the _init_*85_tables() call on every encoding/decoding operation may be too large. Here is a simpler patch.
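For reference, the back-of-the-envelope memory estimate from the message can be reproduced as a runnable sketch. This is not part of the patch; it just evaluates the same terms (table entries, dict container overhead, non-cached int keys), computed here entirely in bytes. Exact sizes vary by Python version and platform, so the total is approximate.

```python
import sys

# Two-byte bytes objects stored in the encode/decode tables:
# 32**2 entries for the b32 table, 85**2 entries each for the two base-85 tables.
table_entries = (32**2 + 2 * 85**2) * sys.getsizeof(b'xx')

# Container overhead of the lookup dicts themselves (keys/values counted separately).
dict_overhead = (sys.getsizeof(dict.fromkeys(range(32**2)))
                 + 2 * sys.getsizeof(dict.fromkeys(range(85**2))))

# Int keys above CPython's small-int cache are separate objects;
# the message approximates the cached range as the first 256 values.
int_keys = 2 * (85**2 - 256) * sys.getsizeof(85**2)

total = table_entries + dict_overhead + int_keys
print(f"estimated table memory: {total / 2**10:.0f} KiB")
```

On a typical CPython build this comes out to a small, fixed number of kilobytes to low megabytes, which is the point of the comparison against the interpreter's 20-25 MB baseline.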
Date | User | Action | Args
2014-03-10 11:26:55 | serhiy.storchaka | set | recipients: + serhiy.storchaka, loewis, pitrou, vstinner, Arfrever, r.david.murray
2014-03-10 11:26:55 | serhiy.storchaka | set | messageid: <1394450815.12.0.355637158944.issue20879@psf.upfronthosting.co.za>
2014-03-10 11:26:55 | serhiy.storchaka | link | issue20879 messages
2014-03-10 11:26:54 | serhiy.storchaka | create |