This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

classification
Title: multiprocessing AuthenticationError "digest sent was rejected" when pickling proxy
Type: behavior
Stage: resolved
Components: Library (Lib)
Versions: Python 2.7
process
Status: closed
Resolution: not a bug
Dependencies:
Superseder:
Assigned To: sbt
Nosy List: Dhanannjay.Deo, Paul.Tunison, davin, jnoller, ned.deily, nirai, peterhunt, sbt, xhantu
Priority: normal
Keywords:

Created on 2009-12-14 07:53 by peterhunt, last changed 2022-04-11 14:56 by admin. This issue is now closed.

Files
File name  Uploaded  Description
multiprocessing_bug.py peterhunt, 2009-12-14 07:53
Messages (8)
msg96371 - (view) Author: Pete Hunt (peterhunt) Date: 2009-12-14 07:53
When pickling a proxy object (such as a list) created on a client, an 
exception is thrown. Example attached.

Python 2.6.4 Mac OS X 10.6

msg96372 - (view) Author: Pete Hunt (peterhunt) Date: 2009-12-14 08:04
UPDATE: this example WORKS if you remove "authkey" - so it seems to be a 
problem with authentication.
msg147657 - (view) Author: (xhantu) Date: 2011-11-15 08:26
Confirmed for Python 2.7.1 on Ubuntu.

The problem lies in the __reduce__ methods of multiprocessing.process.AuthenticationString and multiprocessing.managers.BaseProxy. Pickling of an authkey in BaseProxy is only done, and allowed, when Popen.thread_is_spawning() is True. The comments state this is done for security reasons.

If you pass proxies around after process initialization, no authkey is available after unpickling, so a random authkey is used instead, which results in a digest error. Because of this fallback when no authkey is available, passing None as the authkey did not work in my case.

This bug looks like a design flaw in multiprocessing. If security were really a concern, encryption should be used along with authentication for communication. And passing None as the authkey should disable authentication, as stated in the documentation, rather than substituting a random authkey.

Disabling the Popen.thread_is_spawning() checks allows proxies to be passed. I did this at runtime by replacing the __reduce__ methods of the affected classes, to avoid patching the installed Python.
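[Editor's note: the runtime workaround described above can be sketched roughly as follows. This is a hypothetical reconstruction against the modern Python 3 API, where get_spawning_popen() replaced Popen.thread_is_spawning() and authkeys must be bytes; the function names and the port-0 address are invented for illustration, not the poster's actual code.]

```python
import pickle
from multiprocessing import managers

# Keep a reference to the stock implementation.
_orig_reduce = managers.BaseProxy.__reduce__

def _reduce_with_authkey(self):
    # Call the stock __reduce__, then force the proxy's authkey into the
    # keyword dict that RebuildProxy passes to the reconstructed proxy.
    func, args = _orig_reduce(self)
    # bytes() sidesteps AuthenticationString's own pickling guard.
    args[-1]['authkey'] = bytes(self._authkey)
    return func, args

# Replace the method at runtime instead of patching the installed Python.
managers.BaseProxy.__reduce__ = _reduce_with_authkey

def roundtrip():
    # Manager authkey deliberately differs from current_process().authkey.
    mgr = managers.SyncManager(address=("127.0.0.1", 0), authkey=b"12345")
    mgr.start()
    try:
        proxy = mgr.list([1, 2, 3])
        # Without the patch, loads() would raise AuthenticationError here.
        return list(pickle.loads(pickle.dumps(proxy)))
    finally:
        mgr.shutdown()

if __name__ == "__main__":
    print(roundtrip())
```

Note that this carries the same risk the check was guarding against: the authkey travels in cleartext inside the pickle.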
msg147658 - (view) Author: (xhantu) Date: 2011-11-15 08:27
forgot to set version in classification
msg212303 - (view) Author: Ned Deily (ned.deily) * (Python committer) Date: 2014-02-26 20:50
[Note: due to a bug tracker error, now fixed, a couple of comments made to this issue earlier today were dropped.  I'm manually adding them here on behalf of the original submitters.]

At Wed Feb 26 17:05:01 CET 2014, Paul Tunison added the comment:

I can confirm that this is still an issue with python 2.7.5. My method of resolving this locally is similar to xhantu's. I created a sub-class of BaseProxy, overriding the __reduce__ method and injecting the authkey into the appropriate spot in the super method's returned content.
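[Editor's note: the subclass approach described above might look roughly like this under Python 3, where BaseProxy.__reduce__ returns (RebuildProxy, (proxytype, token, serializer, kwds)) and the last element is the keyword dict for the rebuilt proxy. The class and typeid names here are invented for illustration, not the poster's actual code.]

```python
import pickle
from multiprocessing.managers import SyncManager, ListProxy

class AuthListProxy(ListProxy):
    """List proxy that survives pickling by carrying its authkey along."""
    def __reduce__(self):
        func, args = super().__reduce__()
        # args[-1] is the kwds dict handed to the rebuilt proxy; inject the
        # authkey there (bytes() avoids AuthenticationString's pickle guard).
        args[-1]['authkey'] = bytes(self._authkey)
        return func, args

class MyManager(SyncManager):
    pass

# Register a list type that hands out the authkey-preserving proxy.
MyManager.register('auth_list', list, AuthListProxy)

def roundtrip():
    mgr = MyManager(address=("127.0.0.1", 0), authkey=b"secret")
    mgr.start()
    try:
        proxy = mgr.auth_list()
        proxy.extend([1, 2, 3])
        # current_process().authkey is untouched; the injected key is used.
        return list(pickle.loads(pickle.dumps(proxy)))
    finally:
        mgr.shutdown()

if __name__ == "__main__":
    print(roundtrip())
```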
msg212304 - (view) Author: Ned Deily (ned.deily) * (Python committer) Date: 2014-02-26 20:52
At Wed Feb 26 17:09:49 CET 2014, Dhanannjay Deo added the comment:

Confirmed for Python 2.7.3 on Ubuntu 12.04 LTS. Why is this issue still open after 4 years?
msg214582 - (view) Author: Richard Oudkerk (sbt) * (Python committer) Date: 2014-03-23 14:00
For reasons we all know, unpickling unauthenticated data received over TCP is very risky.  Sending an unencrypted authentication key (as part of a pickle) over TCP would make the authentication useless.

When a proxy is pickled the authkey is deliberately dropped.  When the proxy is unpickled the authkey used for the reconstructed proxy is current_process().authkey.  So you can "fix" the example by setting the current_process().authkey to match the one used by the manager:

import multiprocessing
from multiprocessing import managers
import pickle

class MyManager(managers.SyncManager):
    pass

def client():
    mgr = MyManager(address=("localhost", 2288), authkey="12345")
    mgr.connect()
    l = mgr.list()
    multiprocessing.current_process().authkey = "12345"    # <--- HERE
    l = pickle.loads(pickle.dumps(l))

def server():
    mgr = MyManager(address=("", 2288), authkey="12345")
    mgr.get_server().serve_forever()

server_proc = multiprocessing.Process(target=server)
client_proc = multiprocessing.Process(target=client)
server_proc.start()
client_proc.start()
client_proc.join()
server_proc.terminate()
server_proc.join()

In practice all processes using the manager should have current_process().authkey set to the same value.
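[Editor's note: the example above is Python 2; under Python 3 the same fix applies, except that authkeys must be bytes rather than str. A minimal single-process sketch, with an illustrative port-0 address and invented function name:]

```python
import pickle
import multiprocessing
from multiprocessing.managers import SyncManager

def roundtrip():
    mgr = SyncManager(address=("127.0.0.1", 0), authkey=b"12345")
    mgr.start()
    try:
        proxy = mgr.list([1, 2, 3])
        # The pickled proxy deliberately drops its authkey, so the
        # reconstructed proxy falls back to current_process().authkey --
        # set it to match the manager's key before unpickling:
        multiprocessing.current_process().authkey = b"12345"
        return list(pickle.loads(pickle.dumps(proxy)))
    finally:
        mgr.shutdown()

if __name__ == "__main__":
    print(roundtrip())
```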

I don't claim that multiprocessing supports distributed computing very well, but as far as I can see, things are working as intended.
msg237796 - (view) Author: Davin Potts (davin) * (Python committer) Date: 2015-03-10 18:54
Per Richard's post from one year ago, in which he offers both an explanation and an example and ultimately concludes that things are working as intended, and because there have been no objections or requests for clarification in the past year, I am closing this and marking it as not a bug.
History
Date  User  Action  Args
2022-04-11 14:56:55  admin  set  github: 51752
2015-03-10 18:54:17  davin  set  status: open -> closed; nosy: + davin; messages: + msg237796; resolution: not a bug; stage: resolved
2014-03-23 14:00:31  sbt  set  messages: + msg214582
2014-02-28 13:55:36  sbt  set  assignee: sbt
2014-02-26 20:52:01  ned.deily  set  nosy: + Dhanannjay.Deo; messages: + msg212304; versions: - Python 2.6
2014-02-26 20:50:37  ned.deily  set  nosy: + ned.deily, sbt, Paul.Tunison; messages: + msg212303
2011-11-15 08:27:40  xhantu  set  messages: + msg147658; versions: + Python 2.7
2011-11-15 08:26:08  xhantu  set  nosy: + xhantu; messages: + msg147657
2009-12-25 07:46:05  nirai  set  nosy: + nirai
2009-12-14 12:42:14  r.david.murray  set  priority: normal; type: crash -> behavior
2009-12-14 12:41:22  r.david.murray  set  nosy: + jnoller
2009-12-14 08:04:07  peterhunt  set  type: crash; messages: + msg96372
2009-12-14 07:53:53  peterhunt  create