This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Title: Sixth element of tuple from __reduce__(), inconsistency between pickle and copy
Type: behavior
Stage:
Components: Library (Lib)
Versions: Python 3.11, Python 3.10, Python 3.9
Status: open
Resolution:
Dependencies:
Superseder:
Assigned To:
Nosy List: iritkatriel, lev.bishop, pitrou, rhettinger, serhiy.storchaka
Priority: normal
Keywords:

Created on 2022-01-10 21:41 by lev.bishop, last changed 2022-04-11 14:59 by admin.

Files: uploaded by lev.bishop on 2022-01-10 21:41
Messages (2)
msg410256 - Author: Lev Bishop (lev.bishop), Date: 2022-01-10 21:41
As discussed in the discord thread where Guido suggested opening this issue.

Both the pickle and copy modules of the standard library make use of a class's __reduce__() method to customize the pickle/copy process. They have a consistent view of the first 5 elements of the returned tuple, (func, args, state, listiter, dictiter), but they interpret the 6th element differently. For pickle it is state_setter, a callable with signature state_setter(obj, state) -> None, but for copy it is a replacement for deepcopy, with signature deepcopy(arg: T, memo) -> T.

This seems to be unintentional, since the pickle documentation states:

> As we shall see, pickle does not use directly the methods described
> above. In fact, these methods are part of the copy protocol which
> implements the __reduce__() special method. The copy protocol provides
> a unified interface for retrieving the data necessary for pickling
> and copying objects.

It seems that, for a class whose __reduce__() returns all 6 elements, the method would have to do something very awkward, such as examining its call stack to determine whether it is being called in a pickle or a copy context, so that it can return the appropriate callable. (Naively providing the same callable in both contexts causes errors in one context or the other.)
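A minimal sketch of that awkward workaround (the helper names `_called_from_copy` and `_set_state` and the class `Both` are hypothetical, not from the attached file): since the copy module is pure Python, a class can scan its call stack for a frame belonging to `copy` and return whichever 6th item fits the caller.

```python
import copy
import inspect
import pickle

def _called_from_copy():
    # Guess the caller: copy is pure Python, so a deepcopy() call leaves
    # frames whose module is named 'copy' on the stack. The C pickler
    # leaves no such frames, so pickling takes the other branch.
    return any(f.frame.f_globals.get('__name__') == 'copy'
               for f in inspect.stack())

def _set_state(obj, state):
    # pickle-style state_setter(obj, state): apply state to obj in place
    obj.__dict__.update(state)

class Both:
    def __init__(self):
        self.x = None

    def __reduce__(self):
        # Return the 6th item that matches whoever is calling:
        # copy wants a deepcopy(arg, memo) hook, pickle a state_setter.
        sixth = copy.deepcopy if _called_from_copy() else _set_state
        return (Both, (), {'x': self.x}, None, None, sixth)

b = Both()
b.x = 42
print(copy.deepcopy(b).x)               # 42
print(pickle.loads(pickle.dumps(b)).x)  # 42
```

This "works" for both modules but is fragile by construction, which is the point of the report: no clean __reduce__() can serve both consumers.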

I attach a test file which defines two classes making use of a __reduce__() that returns a 6-element tuple. One class, Pickleable, can be duplicated via pickling but not deepcopied; the converse is true for the Copyable class.

Other than the 6th element of the tuple returned from __reduce__(), the classes are identical.
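The attached file is not reproduced here, but the inconsistency can be sketched along the same lines (the helper `set_state` and the class bodies below are a reconstruction, not the attached code):

```python
import copy
import pickle

def set_state(obj, state):
    # pickle-style state_setter(obj, state): apply state to obj in place
    obj.__dict__.update(state)

class Pickleable:
    """6th item of __reduce__() is a pickle-style state_setter."""
    def __init__(self):
        self.x = None
    def __reduce__(self):
        return (Pickleable, (), {'x': self.x}, None, None, set_state)

class Copyable:
    """6th item of __reduce__() is a copy-style deepcopy hook."""
    def __init__(self):
        self.x = None
    def __reduce__(self):
        return (Copyable, (), {'x': self.x}, None, None, copy.deepcopy)

p = Pickleable()
p.x = 42
print(pickle.loads(pickle.dumps(p)).x)  # 42: pickle calls set_state(obj, state)
try:
    copy.deepcopy(p)
except AttributeError:
    # copy treats set_state as its deepcopy hook and calls
    # set_state(state, memo), which blows up on the state dict
    print('deepcopy failed')

c = Copyable()
c.x = 42
print(copy.deepcopy(c).x)  # 42: copy uses the hook as intended
r = pickle.loads(pickle.dumps(c))
print(r.x)  # None: pickle called copy.deepcopy(obj, state) and ignored the result
```

Note the failure modes differ: the Pickleable class raises under deepcopy, while the Copyable class unpickles without error but silently loses its state.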

Guido dug into the history and found that: 
> it looks like these are independent developments:

> the 6th arg for deepcopy was added 6 years ago via Issue 26167: Improve copy.copy speed for built-in types (list/set/dict) - Python tracker
> the 6th arg for pickle was added 3 years ago via Issue 35900: Add pickler hook for the user to customize the serialization of user defined functions and types. - Python tracker

> I’m guessing the folks doing the latter weren’t aware that deepcopy already uses the 6th arg. Sorting this out will be painful.
msg410262 - Author: Irit Katriel (iritkatriel) (Python committer), Date: 2022-01-10 22:33
I added Serhiy as the author of the deepcopy optimization. Although it was the first to use the 6th item, it is not documented, so I wonder whether it is the easier of the two to change.
Date | User | Action | Args
2022-04-11 14:59:54 | admin | set | github: 90494
2022-01-10 22:33:13 | iritkatriel | set | nosy: + iritkatriel; messages: + msg410262
2022-01-10 22:23:21 | iritkatriel | set | nosy: + serhiy.storchaka
2022-01-10 22:19:51 | eric.araujo | set | nosy: + rhettinger, pitrou; versions: + Python 3.9, Python 3.10, Python 3.11
2022-01-10 21:41:04 | lev.bishop | create