This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: add to dict fails after 1,000,000 items on py 2.7.5
Type: behavior Stage:
Components: Interpreter Core Versions: Python 2.7
process
Status: closed Resolution: not a bug
Dependencies: Superseder:
Assigned To: Nosy List: Elizacat, miltmobley, mrabarnett, pitrou, rhettinger, tim.peters, vstinner
Priority: normal Keywords:

Created on 2013-10-30 04:43 by miltmobley, last changed 2022-04-11 14:57 by admin. This issue is now closed.

Messages (9)
msg201705 - (view) Author: Milton Mobley (miltmobley) Date: 2013-10-30 04:43
d, i = {}, 0
while (i < 10000000):
    n =  i + 1
    d[n] = n
    i += 1

On Py 2.7.5 (Windows 7, x64, 4 GB RAM) this program slowed down noticeably after passing 1,000,000 adds and never completed or raised an exception.
Windows performance monitor showed it was claiming only about 1 MB of memory per second, and it stopped claiming more memory at about 646 MB. The only other large memory consumer in the system was the JRE, using about 550 MB on behalf of Eclipse.
msg201706 - (view) Author: Elizabeth Myers (Elizacat) * Date: 2013-10-30 04:58
Neither your test case nor the one I wrote triggers the problem on Python 2.7.5 or 3.3.2 on 64-bit Linux:

try: xrange
except NameError: xrange = range

d = {}
r = 10000000
ctr = 0
for n in xrange(r):
    d[n] = n
    ctr += 1

assert len(d) == r
assert ctr == r
msg201707 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2013-10-30 05:00
Works for me -- on 64-bit OS X 10.9 running Python 2.7.5
msg201708 - (view) Author: Tim Peters (tim.peters) * (Python committer) Date: 2013-10-30 05:10
Just another data point:  runs fine on Vista, 32-bit box, Python 2.7.5.  Python is consuming about 320MB when the dict is done building.
msg201731 - (view) Author: STINNER Victor (vstinner) * (Python committer) Date: 2013-10-30 13:20
It works for me on Linux 64-bit:

$ python
Python 2.7.3 (default, Aug  9 2012, 17:23:57) 
[GCC 4.7.1 20120720 (Red Hat 4.7.1-5)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> d, i = {}, 0
>>> while (i < 10000000):
...     n =  i + 1
...     d[n] = n
...     i += 1
... 
>>> import os
>>> os.system("grep VmRSS /proc/%s/status" % os.getpid())
VmRSS:	  637984 kB
0

"On Py 2.7.5 (windows7, x64, 4GB ram) this program slowed down obviously after passing 1,000,000 adds and never completed or raised an exception."

What is the exception? How much free memory do you have? If Python does not have enough memory, Windows will probably start moving memory to the disk and the system will become slower and slower.
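If it helps to reproduce the measurement, here is a minimal sketch for watching the process's resident memory while the dict grows; it assumes the third-party psutil package (not used in the original report) is installed:

# Sketch: print resident memory every 1,000,000 insertions (assumes psutil).
import psutil

proc = psutil.Process()          # the current Python process
d = {}
for n in range(1, 10000001):
    d[n] = n
    if n % 1000000 == 0:
        rss_mb = proc.memory_info().rss / (1024.0 * 1024.0)
        print("%8d items, RSS = %.1f MB" % (n, rss_mb))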
msg201767 - (view) Author: Milton Mobley (miltmobley) Date: 2013-10-30 19:03
I followed the suggestion of email responders to use xrange instead of while, and observed that 32-bit SUSE Linux got past 44,000,000 adds before exiting with a MemoryError, while 64-bit Windows 7 slowed down markedly after 22,000,000 adds and was unusable after 44,000,000 adds. However, the program did not stop or raise an exception, which is a concern. The size of the dict was 1.6 GB at that level. My current suspicion is that Windows is not doing a good job of pushing memory already allocated by the process out to the page file as the process continues to request more memory. But in my opinion Python should be able to detect failure to complete an allocation request on Windows, and raise an appropriate exception, as it does on Linux.
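As a rough cross-check of figures like the 1.6 GB above, sys.getsizeof (standard library) reports the size of the dict container itself; note that it does not count the int keys and values the dict references, so it is only a lower bound:

# Sketch: size of the dict container alone, not the objects it holds.
import sys

d = {n: n for n in range(1, 10000001)}
print("dict container: %.1f MB" % (sys.getsizeof(d) / (1024.0 * 1024.0)))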
msg201768 - (view) Author: Matthew Barnett (mrabarnett) * (Python triager) Date: 2013-10-30 19:09
Works for me: Python 2.7.5, 64-bit, Windows 8.1
msg201769 - (view) Author: Antoine Pitrou (pitrou) * (Python committer) Date: 2013-10-30 19:11
> But in my opinion Python should be able to detect failure to complete an allocation request on Windows

Which failure? You're telling us it doesn't fail, it just becomes slow.
(by the way, have you checked whether your machine is swapping when that happens?)
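One quick way to check for swapping (not part of the original thread; it again assumes the third-party psutil package) is to sample system swap usage around the time of the slowdown:

# Sketch: report system swap usage (assumes psutil is installed).
import psutil

swap = psutil.swap_memory()
print("swap total: %.0f MB, used: %.0f MB (%.1f%%)"
      % (swap.total / 2.0**20, swap.used / 2.0**20, swap.percent))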
msg201847 - (view) Author: Milton Mobley (miltmobley) Date: 2013-10-31 20:32
I now believe the problem of slow execution is caused by the performance of the Windows 7 page file, and not by a Python bug. Others reported that similar tests worked on Windows 8.1 and various Linux systems, so I request that this Python "bug" be closed or withdrawn.
History
Date                 User               Action  Args
2022-04-11 14:57:52  admin              set     github: 63642
2013-10-31 20:40:57  benjamin.peterson  set     status: open -> closed
                                                resolution: not a bug
2013-10-31 20:32:25  miltmobley         set     messages: + msg201847
2013-10-30 19:11:58  pitrou             set     nosy: + pitrou
                                                messages: + msg201769
2013-10-30 19:09:45  mrabarnett         set     nosy: + mrabarnett
                                                messages: + msg201768
2013-10-30 19:03:05  miltmobley         set     messages: + msg201767
2013-10-30 13:20:40  vstinner           set     nosy: + vstinner
                                                messages: + msg201731
2013-10-30 05:10:16  tim.peters         set     nosy: + tim.peters
                                                messages: + msg201708
2013-10-30 05:00:40  rhettinger         set     nosy: + rhettinger
                                                messages: + msg201707
2013-10-30 04:58:32  Elizacat           set     nosy: + Elizacat
                                                messages: + msg201706
2013-10-30 04:43:03  miltmobley         create