
Author: illume
Date: 2006-06-30 06:02:38


I have done some more tests, and it seems that
dictionaries do not release as much memory as lists do.


Here is a modification of the last example posted.

If you only let fill2() run, almost all the memory is
freed; fill2() uses lists.  However, if you let the others,
which use dicts, run, not all of the memory is freed: the
process is still at 38MB after the data is del'ed.

This is still much better than Python 2.4, but I think
something fishy must be going on with dicts.





<pre>

import random
import os

def fill_this_one_doesnot_free_much_at_all(map):
    # Builds roughly 300k small dicts keyed by random
    # 3-tuples, all stored in the passed-in mapping.
    # Fills the mapping in place and returns None.
    random.seed(0)
    for i in xrange(300000):
        k1 = (random.randrange(100),
              random.randrange(100),
              random.randrange(100))
        k2 = random.randrange(100)
        if k1 in map:
            d = map[k1]
        else:
            d = dict()
        d[k2] = d.get(k2, 0) + 1
        map[k1] = d


def fill(map):
    # Fills the mapping with 3 million string values.
    random.seed(0)
    for i in xrange(3000000):
        map[i] = "asdf"


class Obj:
    def __init__(self):
        self.dumb = "hello"


def fill2(map):
    # List-based variant: ignores the mapping and returns
    # a list of 300k small objects instead.
    a = []
    for i in xrange(300000):
        o = Obj()
        a.append(o)
    return a


if __name__ == "__main__":
    import time
    import gc

    # Assumes the script is saved as "memtest" so that
    # "ps v | grep memtest" matches this process.
    os.system('ps v | grep memtest')
    d = dict()
    a = fill2(d)
    #a2 = fill(d)
    a2 = fill_this_one_doesnot_free_much_at_all(d)
    print "-" * 50
    print "\n" * 3
    os.system('ps v | grep memtest')
    del d
    del a

    gc.collect()

    time.sleep(2)
    for x in xrange(100000):
        pass
    print "-" * 50
    print "\n" * 3
    os.system('ps v | grep memtest')
</pre>
History
Date: 2007-08-23 15:41:52  User: admin  Action: link  Args: issue1123430 messages
Date: 2007-08-23 15:41:52  User: admin  Action: create