
Author belopolsky
Recipients belopolsky, rhettinger, spiv
Date 2010-05-11.16:11:01
I have two problems with this proposal:

1. In constrained memory environments, creating a temporary internal copy of a large set may cause a difference operation to fail that would otherwise succeed.

2. The break-even point between extra lookups and a copy is likely to be different on different systems or even on the same system under different loads.

Programs that suffer from poor large_set.difference(small_set) performance can be rewritten as large_set_copy = large_set.copy(); large_set_copy.difference_update(small_set), or even simply as large_set.difference_update(small_set) if program logic allows it.
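A minimal sketch of that rewrite (the set names and sizes here are illustrative, not taken from any real program):

```python
large_set = set(range(1_000_000))
small_set = set(range(100))

# Allocating approach: builds and returns a brand-new result set.
result = large_set.difference(small_set)

# Rewrite 1: copy once explicitly, then mutate the copy in place.
large_set_copy = large_set.copy()
large_set_copy.difference_update(small_set)
assert large_set_copy == result

# Rewrite 2: if the original set is no longer needed afterwards,
# skip the copy entirely and mutate the original in place.
large_set.difference_update(small_set)
assert large_set == result
```

The in-place forms make the copy (or the absence of one) explicit in the caller's code, rather than leaving the copy-vs-lookup trade-off to a heuristic inside the interpreter.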