
Author: rshura
Date: 2006-01-24 20:50:51

With the SVN version of _bsddb.c I no longer get a segfault
with my test. Instead I get the following exception:

Traceback (most recent call last):
  File "test2.py", line 37, in ?
    person_map.associate(surnames,find_surname,db.DB_CREATE,txn=the_txn)
MemoryError: (12, 'Cannot allocate memory -- Lock table is out of available locks')
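
For reference, here is roughly how one can raise the lock-table
limits on the environment before it is opened; the directory and
the limit values below are made up for illustration, and
set_lk_max_objects may matter as much as set_lk_max_locks:

from bsddb import db  # "from bsddb3 import db" for the standalone pybsddb

env = db.DBEnv()
env.set_lk_max_locks(100000)    # stock default is quite low
env.set_lk_max_objects(100000)  # lock objects are a separate limit
env.open('/path/to/env',        # hypothetical environment directory
         db.DB_CREATE | db.DB_INIT_MPOOL | db.DB_INIT_LOCK |
         db.DB_INIT_LOG | db.DB_INIT_TXN)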

Now, please bear with me here if you can. It's easy to shrug
this off by saying that I simply don't have enough locks for
this huge transaction. But the exact same code works fine with
the pm_ok.db file from my test case, and that file has the
exact same number of elements and the exact same structure for
both the data and the secondary index computation. So one would
think it needs the exact same number of locks, and yet it works
while pm.db does not.

The only difference between the two data files is that in
each data item, data[0] is much larger in pm.db than in
pm_ok.db.

Is it remotely possible that the actual error has nothing to
do with locks but rather with the data size? What can I do
to find out or fix this?
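
One thing I suppose I could try is to compare the lock
statistics for the two files around the associate() call. A
rough sketch, reusing the objects from my test script (env
stands for the DBEnv my test opens; the stat key names are my
assumption based on the DB_LOCK_STAT fields and may differ
between BDB versions):

try:
    person_map.associate(surnames, find_surname, db.DB_CREATE, txn=the_txn)
finally:
    # lock_stat() returns a dict of lock-region statistics; if the
    # peak (maxnlocks) hits the limit (maxlocks) only for pm.db,
    # the error really is lock exhaustion.
    stats = env.lock_stat()
    print 'locks now/peak/limit: %s / %s / %s' % (
        stats.get('nlocks'), stats.get('maxnlocks'), stats.get('maxlocks'))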

Thanks for your help!