
Author lukasz.langa
Recipients Arfrever, barry, benjamin.peterson, eric.snow, hodgestar, lukasz.langa, pitrou, python-dev, rhettinger, robagar, serhiy.storchaka
Date 2017-09-12.14:09:52
Message-id <1505225392.1.0.407749246452.issue16251@psf.upfronthosting.co.za>
Content
> What problem does PR 3508 try to solve?

The two test cases added demonstrate objects that were impossible to pickle before and now can be pickled.

> Swallowing all exceptions looks like an antipattern to me.

This is the only thing we can do when faced with custom `__getattr__()` implementations, especially since `copy.deepcopy()` creates new objects with `cls.__new__()`, something most class implementers won't expect.
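For illustration, here is a minimal sketch of the failure mode (the `Lazy` class is hypothetical, not taken from PR 3508, and the exact behavior depends on the Python version):

import copy
import pickle

class Lazy:
    """Hypothetical class whose __getattr__ assumes __init__ has run."""

    def __init__(self):
        self._cache = {}

    def __getattr__(self, name):
        # On an instance created by cls.__new__() alone, self._cache is
        # missing, so this lookup re-enters __getattr__ for "_cache" and
        # recurses until RecursionError is raised.
        try:
            return self._cache[name]
        except KeyError:
            raise AttributeError(name)

obj = Lazy()
data = pickle.dumps(obj)
# Both reconstruction paths create the new object with Lazy.__new__(Lazy)
# and then probe it for __setstate__; with the class above that probe can
# blow up with RecursionError instead of a clean AttributeError.
pickle.loads(data)   # may raise RecursionError
copy.deepcopy(obj)   # may raise RecursionError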

> Rather than failing and allowing the programmer to fix the pickleability of their class, this can silently produce incorrect pickle data.

Can you give me an example where this would lead to incorrect pickle data?

> By swallowing MemoryError and RecursionError, this changes the behavior of objects in environments with limited resources: low memory or deep call stacks. This adds heisenbugs.

This is exactly what `hasattr()` did in Python 2: it swallowed all exceptions.  That is why the `RecursionError` example I added to the tests worked just fine in Python 2.
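A small sketch of that difference (the class is hypothetical, not code from the patch):

class Weird:
    def __getattr__(self, name):
        raise KeyError(name)  # deliberately not AttributeError

w = Weird()
# Python 2: hasattr(w, "anything") swallowed the KeyError and returned False.
# Python 3: hasattr() only suppresses AttributeError, so the KeyError
# propagates to the caller.
hasattr(w, "anything")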