classification
Title: New class special method lookup change
Type:  Stage:
Components: Documentation
Versions: Python 3.0, Python 2.6
process
Status: closed
Resolution: fixed
Dependencies:  Superseder:
Assigned To: ncoghlan
Nosy List: PiDelport, Rhamphoryncus, barry, georg.brandl, gvanrossum, jcea, jkrukoff, kundeng06, ncoghlan, rhettinger, terry.reedy
Priority: critical
Keywords: patch

Created on 2002-11-25 23:37 by terry.reedy, last changed 2022-04-10 16:05 by admin. This issue is now closed.

Files
File name  Uploaded  Description
typetools.rst  ncoghlan, 2008-05-25 13:04  Documentation for the typetools module
typetools.py ncoghlan, 2008-05-25 13:05 Implementation of ProxyMixin class
test_typetools.py ncoghlan, 2008-05-25 13:05 Unit test file for ProxyMixin
issue643841_typetools.diff ncoghlan, 2008-06-05 14:16 Patch to add the typetools module
proxymixin.diff ncoghlan, 2008-06-11 14:20 Cleaned up version of patch
special_method_lookup_docs.diff ncoghlan, 2008-08-04 12:13 Updates to data model section of language reference
Messages (47)
msg60293 - (view) Author: Terry J. Reedy (terry.reedy) * (Python committer) Date: 2002-11-25 23:37
The lookup of special methods invoked implicitly by syntax other than
explicit instance.attribute changed to *not* use __getattr__ when normal
lookup failed.  This is contrary to the docs, which consistently say
__getattr__ is unchanged.  The new special method __getattribute__ is also
bypassed, contrary to the implication of the doc essay.

This was reported on c.l.p by Jan Decaluwe using 2.2.2 on Red Hat Linux.
On Win98 with 2.2.1, I get the same behavior (I added a test of
__getattribute__):

class Iold:
  def __init__(self,ob):
    self.ob = ob
  def __getattr__(self,name):
    return getattr(self.ob,name)

class Inew(object):
  def __init__(self,ob):
    self.ob = ob
  def __getattr__(self,name):
    return getattr(self.ob,name)

a = Iold(1) #2
b = Inew(1) #2
a.__add__(1) #2
b.__add__(1) #2
a+1 #2
b+1 #error
#TypeError: unsupported operand types for +: 'Inew' and 'int'
Inew.__getattribute__ = Inew.__getattr__
b+1 #same error, no loop
#TypeError: unsupported operand types for +: 'Inew' and 'int'

b.__add__(1) # WARNING starts 'infinite' loop

def _(self,other): print 'hello'

Inew.__add__ = _
b+1 #prints 'hello', __getattribute__ bypassed.

http://python.org/2.2/descrintro.html says:
"Note that while in general operator overloading works just as for classic
classes, there are some differences. (The biggest one is the lack of support
for __coerce__; new-style classes should always use the new-style numeric
API, which passes the other operand uncoerced to the __add__ and __radd__
methods, etc.)"

Was the lookup change meant to be one of the differences?

"There's a new way of overriding attribute access. The 
__getattr__ hook, if defined, works the same way as it 
does for classic classes: it is only called if the regular 
way of searching for the attribute doesn't find it."

But it is different.

 "But you can now also override __getattribute__, a 
new operation that is called for all attribute 
references." 

Except for implicit special methods.

I did not classify the discrepancy because I don't know whether the
interpreter or the docs are off.




msg60294 - (view) Author: Terry J. Reedy (terry.reedy) * (Python committer) Date: 2002-11-26 04:25
Print-style 'debugging' output provided by Bengt Richter in a follow-up in
the c.l.p. thread "Issue with new-style classes and operators" showed that
'a+1' worked because of a __getattr__ call getting '__coerce__' (rather
than '__add__') and that 'b+1' did not trigger such a call.  So I presume
the quoted parenthesized statement about __coerce__ and the 'new-style
numeric API' was meant to explain at least this part of the change in
behavior.  However, it is still not clear to me, even after reading the
development (2.3a) version of the ref manual, why failure to find
'__add__' in 'the usual places' (to quote the RefMan 3.3.2 __getattr__
entry) does not trigger a call to __getattr__, nor why the initial attempt
to find it did not trigger __getattribute__.  The sentence 'For objects x
and y, first x.__op__(y) is tried' (3.3.7) implies to me that there is an
attempt to find __op__ as an attribute of x.  Unless I missed something I
should have found, a bit more clarification might usefully be added.
msg60295 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2004-09-27 08:54
FWIW, this is still true for Python 2.4 (i.e. the search for
'__add__' by '+' invokes neither __getattr__ nor
__getattribute__).

The 'infinite loop' is courtesy of the call to 'self.ob'
inside __getattr__.

Also of interest - int(b) fails with a TypeError, and str(b)
returns the generic object representation of b, rather than
delegating to the contained object.

So it looks like these methods are not invoked when looking
up any 'magic methods' recognised by the interpreter.
Inspection of PyNumber_Add shows that this is the case - the
objects' method pointers are inspected directly at the C level.

To me, this looks like a documentation bug. Magic methods
are found by checking if the associated slot in the method
structure is not NULL, rather than by actually looking for
the relevant magic 'attribute'.

In order to change the 'implicit' behaviour of the object,
it is necessary to change the contents of the underlying
method slot - a task which is carried out by type whenever
an attribute is set, or when a new instance of type is created.
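
A short sketch (Python 2.x; not from the original messages) restating that
in terms of slots: the implicit '+' never consults __getattr__, but
assigning on the class refreshes the slot:

class Num(object):
    def __init__(self, value):
        self.value = value
    def __getattr__(self, name):
        print 'getattr asked for', name   # never printed for the implicit '+'
        raise AttributeError(name)

n = Num(1)
try:
    n + 1
except TypeError:
    print 'no nb_add slot, and __getattr__ was not consulted'

# type.__setattr__ refills the slot when the class attribute is set:
Num.__add__ = lambda self, other: self.value + other
print n + 1   # 2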



msg64641 - (view) Author: John Krukoff (jkrukoff) Date: 2008-03-28 17:54
I was just bitten by this today in converting a proxy class from old style
to new style. The official documentation was of no help in discovering
that neither __getattr__ nor __getattribute__ is used to look up magic
attribute names. Even the link to "New-style Classes" off the
development documentation page is useless, as none of the resources
there (http://www.python.org/doc/newstyle/) mention the incompatible change.

This seems like an issue that is going to come up more frequently as
python 3000 pushes everyone to using only new style classes. It'd be
very useful if whatever conversion tool we get, or the python 3000
standard library includes a proxy class or metaclass that is able to
help with this conversion, such as this one:
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/252151

Though preferably with some knowledge of all existing magic names.
msg64678 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-03-29 03:47
The 2.6 and 3.0 documentation has already been updated appropriately.
The forward-compatible way to handle this is to inherit from object and
override the special methods explicitly in the 2.x version (yes, this can
make writing proxy objects more tedious, but from our experience with
the tempfile module, I would say that the bulk of proxy objects that
aren't overriding special methods on a case-by-case basis are probably
broken anyway).

It may be appropriate to add a -3 warning that is triggered whenever a
special method is retrieved via __getattr__ on a classic class.
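
A minimal sketch of that forward-compatible approach (illustrative names
only, not code from this issue's attachments):

class Wrapper(object):
    def __init__(self, obj):
        self._obj = obj
    def __getattr__(self, name):
        # Still fine for ordinary attributes accessed explicitly
        return getattr(self._obj, name)
    # Special methods invoked implicitly must be defined on the class itself
    def __len__(self):
        return len(self._obj)
    def __add__(self, other):
        return self._obj + other

w = Wrapper([1, 2, 3])
print w.count(2)   # 1 - ordinary attribute, reaches __getattr__
print len(w)       # 3 - only works because __len__ is defined on the class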
msg64813 - (view) Author: John Krukoff (jkrukoff) Date: 2008-04-01 15:50
I assume when you say that the documentation has already been updated, 
you mean something other than what's shown at:
http://docs.python.org/dev/reference/datamodel.html#new-style-and-classic-classes
or
http://docs.python.org/dev/3.0/reference/datamodel.html#new-style-and-classic-classes ?

As both of those claim to still not be up to date in relation to new 
style classes, and the __getattr__ & __getattribute__ sections under 
special names make no reference to how magic methods are handled. 
Additionally, the "Class Instances" section under the type heirachy 
makes mention of how attributes are looked up, but doesn't mention the 
new style differences even in the 3.0 documentation.

Sure, there's this note under "Special Method Names": 
For new-style classes, special methods are only guaranteed to work if 
defined in an object’s class, not in the object’s instance dictionary. 

But that only helps you figure it out if you already know what the 
problem is, and it's hardly comprehensive.

I'm not arguing that this is something that's going to change, as we're 
way past the point of discussion on the implementation, but this looks 
far more annoying if you're looking at it from the perspective of 
proxying to container classes or numeric types in a generic fashion. My 
two use cases I've had to convert are for lazy initialization of an 
object and for building an object that uses RPC to proxy access to an 
object to a remote server.

In both cases, since they are generic proxies that once initialized are 
supposed to behave exactly like the proxied instance, the list of magic 
methods to pass along is ridiculously long. Sure, I have to handle 
__copy__ & __deepcopy__, and __getstate__ & __setstate__ to make sure 
that they return instances of the proxy rather than the proxied object, 
but other than that there's over 50 attributes to override for new 
style classes just to handle proxying to numeric and container types. 

It's hard to get fancy about it too, as I can't just dynamically add 
the needed attributes to my instances by looking at the object to be 
proxied; it really has to be a static list of everything that Python 
supports on the class. Additionally, while metaclasses might help here, 
there's still the problem that while my old style proxy class has 
continued to work fine as magic attributes have been added over python 
revisions, my new style equivalent will have to be updated work 
currectly as magic methods are added. Which, given the 2.x track seems 
to happen pretty frequently. Some of the bugs from that would have been 
quite tricky to track down too, such as the __cmp__ augmentation with 
the rich comparison methods.

None of the solutions really seem ideal, or at least as good as what 
old style classes provided, which is why I was hoping for some support 
in the 3.0 standard library or the conversion tool.
msg64853 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-04-02 13:17
Agreed, that section of the docs should be more explicit in pointing out
that __getattr__ and __getattribute__ won't work for proxying special
methods for new-style classes.

It's also true that there are quite a few special methods where nothing
special is needed to correctly proxy the methods. That said, defining
all of the methods blindly is also bad (as a lot of informal type
testing is done by checking for certain special methods as attributes).

Perhaps it would be useful to have a method on type instances that could
be explicitly invoked to tell the type to run through all the tp_* slots
and populate them based on doing getattr(self, "__slotname__").

(Note that it's only those methods with dedicated slots in the C type
structure that are a problem here - those which are implemented solely
as normal Python methods like __enter__, __exit__ or the pickle module's
special methods can typically be overridden as expected through the use
of __getattr__ and __getattribute__)
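
An illustrative sketch of that distinction (CPython 2.5/2.6 behaviour, as
described above; the class names are made up for the example):

from __future__ import with_statement   # needed on 2.5 only

class Target(object):
    def __enter__(self):
        print 'entered'
        return self
    def __exit__(self, *exc_info):
        print 'exited'
        return False

class Delegating(object):
    def __init__(self, obj):
        self._obj = obj
    def __getattr__(self, name):
        return getattr(self._obj, name)

d = Delegating(Target())
# __enter__/__exit__ have no dedicated tp_* slot, so on CPython 2.5/2.6 the
# with statement finds them via ordinary attribute lookup -> __getattr__:
with d:
    pass

# __len__ is slot-backed, so the same trick fails:
try:
    len(Delegating([1, 2, 3]))
except TypeError, exc:
    print exc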
msg64854 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-04-02 13:32
I've started a discussion on the Py3k development list regarding the
possibility of adding a new method to type to aid in converting proxy
classes from classic- to new-style.
msg65179 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-04-08 16:10
I spent an enlightening evening browsing through the source code for
weakref.proxy. The way that code works is to define every slot,
delegating to the proxied object to handle each call (wrapping and
unwrapping the proxied object as needed).

This is normally transparent to the user due to the fact that
__getattribute__ is one of the proxied methods (and at the C level, the
delegated slot invocations return NotImplemented or set the appropriate
exceptions). The only way it shows through is the fact that
operator.isNumber and operator.isMapping will always return True for the
proxy instance, and operator.isSequence will always return False - this
is due to the proxy type filling in the number and mapping slots, but
not the sequence slots.

However, this prompted me to try an experiment (Python 2.5.1), and the
results didn't fill me with confidence regarding the approach of
expecting 3rd party developers to explicitly delegate all of the special
methods:

>>> import operator, weakref
>>> class Demo:
...   def __index__(self):
...     return 1
...
>>> a = Demo()
>>> b = weakref.proxy(a)
>>> operator.index(a)
1
>>> operator.index(b)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'weakproxy' object cannot be interpreted as an index

Oops.
msg65575 - (view) Author: Pi Delport (PiDelport) Date: 2008-04-17 05:44
Somewhat related: #2605 (descriptor __get__/__set__/__delete__)
msg65677 - (view) Author: John Krukoff (jkrukoff) Date: 2008-04-22 18:25
I've been following the py3k mailing list discussion for this issue, 
and wanted to add a note about the proposed solution described here:
http://mail.python.org/pipermail/python-3000/2008-April/013004.html

The reason I think this approach is valuable is that in all of the 
proxy classes I've written, I'm concerned about which behaviour of the 
proxied class I want to override, not which behaviour I want to keep. 
In other words, when I proxy something, my mental model has always 
been, okay, I want something that behaves just like X, except it does 
this (usually small bit) differently.

This is also why I expect my proxies to keep working the same when I 
change the proxied class, without having to go and update the proxy to 
also use the new behaviour.

So, yeah, very much in favor of a base proxy class in the standard 
library.
msg67127 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-20 14:15
I've attached a sample ProxyBase class that delegates all of the special
methods handled by weakref.proxy (as well as the tp_oct and tp_hex
slots, which weakref.proxy ignores).

While there are other special methods in CPython (e.g. __enter__ and
__exit__), those don't bypass the normal lookup mechanism, so the
__getattribute__ override should handle them.

For lack of a better name, I called the module typetools (by analogy to
functools).

Note that correctly implementing a proxy class as a new-style class
definitely turns out to be a fairly non-trivial exercise (and since this
one is still sans-tests, I don't make any promises that even it is 100%
correct at this point).
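
The attached typetools.py is not reproduced here; the following is a much
simplified, illustrative sketch of the pattern it describes (a
__getattribute__ catch-all plus explicit definitions for the slot-backed
methods):

class SimpleProxy(object):
    def __init__(self, target):
        object.__setattr__(self, '_target', target)
    def __getattribute__(self, name):
        # Catch-all: everything looked up by name goes to the target
        target = object.__getattribute__(self, '_target')
        if name == '_target':
            return target
        return getattr(target, name)
    # Slot-backed special methods still need explicit delegation:
    def __len__(self):
        return len(object.__getattribute__(self, '_target'))
    def __add__(self, other):
        return object.__getattribute__(self, '_target') + other
    def __repr__(self):
        return '<SimpleProxy for %r>' % (object.__getattribute__(self, '_target'),)

p = SimpleProxy([1, 2])
p.append(3)     # reaches list.append through __getattribute__
print len(p)    # 3 - only because __len__ is delegated explicitly
print p         # <SimpleProxy for [1, 2, 3]>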
msg67136 - (view) Author: Adam Olsen (Rhamphoryncus) Date: 2008-05-20 18:48
Is there any reason not to name it ProxyMixin, a la DictMixin?
msg67155 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-21 14:14
Attached a new version of the module, along with a unit test file. The
unit tests caught a bug in the __gt__ implementation. I've also changed
the name to ProxyMixin as suggested by Adam and switched to using a
normal __init__ method (there was no real gain from using __new__ instead).
msg67190 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-22 13:44
Added documentation, and assigned to Barry as release manager for 2.6/3.0.

Also bumped to 'release blocker' status because I think the loss of
classic classes' transparent proxying capabilities is a fairly
substantial issue that needs to be addressed explicitly before the first
3.0 beta.

If I get the go-ahead from Barry or Guido, I'll add the new module to
2.6 (from whence it will be migrated to 3.0 as part of the normal merge
process).
msg67191 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-22 13:48
Also changed to a library issue instead of a docs issue.
msg67201 - (view) Author: Adam Olsen (Rhamphoryncus) Date: 2008-05-22 17:09
_deref won't work for remote objects, will it?  Nor _unwrap, although
that starts to get "fun".
msg67207 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-22 22:16
Correct, this isn't intended to be an all-singing, all-dancing proxy
implementation - it's meant to be a simple solution for local proxies
that want to change the behaviour of a few operations while leaving
other operations unaffected.

The proposed documentation I uploaded covers some of its limitations.
However, even for those cases, ProxyMixin and/or
test_typetools.TestProxyMixin can be used as a reference to make sure a
custom proxy implementation correctly handles all the special method
invocations that can bypass __getattribute__.
msg67209 - (view) Author: Adam Olsen (Rhamphoryncus) Date: 2008-05-22 22:48
If it's so specialized then I'm not sure it should be in the stdlib -
maybe as a private API, if there was a user.

Having a reference implementation is noble, but this isn't the right way
to do it.  Maybe as an example in Doc or in the cookbook.  Better yet,
add the unit test and define the ProxyMixin directly in that file.
msg67223 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-23 08:34
Specialised? What's specialised about it? It's designed to serve as a
replacement for the simple delegation use cases that are currently met
quite adequately by classic classes, since those are no longer available
in 3.0.
msg67224 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-23 08:49
As to the rationale for having it in the standard library: it's because
of the coupling with the implementation of the type() builtin. If a new
slot is added to type() that the interpreter may access without
consulting __getattribute__, then ProxyMixin can be updated to correctly
delegate that slot. If alternate implementations such as Jython or
IronPython have additional such slots, they can also provide their own
version of ProxyMixin.

If ProxyMixin doesn't become the development responsibility of the same
group that is responsible for the implementation of the type object then
it may as well not exist (since it can't be trusted to be kept up to
date with the development process).
msg67257 - (view) Author: Adam Olsen (Rhamphoryncus) Date: 2008-05-23 17:51
Surely remote proxies fall under what would be expected for a "proxy
mixin"?  If it's in the stdlib it should be a canonical implementation,
NOT a reference implementation.

At the moment I can think up 3 use cases:
* weakref proxies
* lazy load proxy
* distributed object

The first two could be done if _deref were made overridable.  The third
needs to turn everything into a message, which we could either do
directly, or do by turning everything into normal method
lookups which then get handled through __getattribute__.
msg67276 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-24 00:25
The way to override _deref/_unwrap is to make _target a property on a
ProxyMixin  subclass (which reminds me, I want to put in an explicit
test case to make sure that works as intended - I'll use the
weakref-proxy-in-Python as the example, so I'll also need to fix the
docs to indicate that such a thing doesn't actually require rewriting
the whole class). 
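
A rough, self-contained sketch of that idea (stand-in class names rather
than the attached ProxyMixin, so treat it as illustrative only):

import weakref

class DelegatingBase(object):
    # Stand-in for ProxyMixin: special methods re-read self._target on
    # every delegation, so a subclass can redefine how the target is found.
    def __len__(self):
        return len(self._target)
    def __repr__(self):
        return '<proxy for %r>' % (self._target,)

class WeakDelegate(DelegatingBase):
    def __init__(self, obj):
        self._ref = weakref.ref(obj)
    @property
    def _target(self):
        obj = self._ref()
        if obj is None:
            raise ReferenceError('weakly proxied object no longer exists')
        return obj

class Node(object):
    def __len__(self):
        return 5

n = Node()
p = WeakDelegate(n)
print len(p)    # 5, fetched through the _target property each time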

And there are a lot more use cases than just the ones you listed,
primarily in the area of interface adaptation (i.e. the programmer just
wants to fiddle with the visible API of the object a bit, rather than
doing anything clever with the way the proxy's target is referenced).

The reason I'm wary of attempting to provide direct support for the
distributed communications use case, is that distributed computing needs
to deal with all sorts of issues in relation to serialisation and
transport of arguments and return values, that a local proxy of any kind
simply doesn't have to deal with (since it can just pass direct
references around). Note that merely diverting everything through
__getattribute__ isn't even close to enough, due to the argument and
return value transport problem - the RPC mechanism needs to understand
the methods that are being invoked as well. So I'm quite happy leaving
all those issues to tools that are actually designed to handle them
(CORBA, XML-RPC, etc).
msg67346 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-05-25 13:04
Updated test cases and documentation to cover use of a descriptor for
_target in a subclass, and uploaded most recent local copies of all 3 files.
msg67712 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-05 14:16
I've attached the latest version of the module as an actual patch
against Python SVN.

It differs slightly from the last version uploaded as separate files, in
that in-place operations on a proxied object will no longer strip the
proxy wrapper off the object. Instead, either the same proxy object will
be returned if the target returned itself from the in-place operation
(mutable objects), or a new proxy wrapper will be created around the result
if the target returned a different object (immutable objects).

Example with a mutable target:
>>> from typetools import ProxyMixin as p
>>> x = p([])
>>> last_x = x
>>> x += [1]
>>> x
<ProxyMixin for [1]>
>>> last_x
<ProxyMixin for [1]>

Example with an immutable target:
>>> from typetools import ProxyMixin as p
>>> x = p(1)
>>> last_x = x
>>> x += 1
>>> x
<ProxyMixin for 2>
>>> last_x
<ProxyMixin for 1>
msg67713 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-05 14:17
Note that I'll be offline for the next few days, so I won't be able to
respond to any comments until some time next week.
msg67719 - (view) Author: Adam Olsen (Rhamphoryncus) Date: 2008-06-05 17:33
The inplace operators aren't right for weakref proxies.  If a new object
is returned there likely won't be another reference to it and the
weakref will promptly be cleared.

This could be fixed with another property like _target, which by default
returns type(self)(result).  Weakref proxies could override it to raise an
exception instead.
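
A rough sketch of that suggestion (hypothetical names; a similar hook later
appears in the attached patch as return_inplace(), per msg67976 below):

class InplaceDelegate(object):
    # Illustrative fragment only: the hook deciding what an augmented
    # assignment hands back is factored out so subclasses can override it.
    def __init__(self, target):
        self._target = target
    def _rewrap(self, result):
        if result is self._target:
            return self            # mutable target mutated in place
        return type(self)(result)  # immutable target produced a new object
    def __iadd__(self, other):
        target = self._target
        if hasattr(target, '__iadd__'):
            result = target.__iadd__(other)
        else:
            result = target + other
        return self._rewrap(result)

# A weakref-style subclass could override _rewrap to raise instead, since a
# brand-new result would have no other strong references keeping it alive.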
msg67744 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-06 01:58
Ah, that would answer my #XXX comment regarding that in the patch.
Agreed, the best answer will be to factor out the _rewrap operation into
a new class method. (Next week though...)
msg67948 - (view) Author: Barry A. Warsaw (barry) * (Python committer) Date: 2008-06-11 11:29
This is a huge thread and I don't have time to look at the entire patch.  

However, it seems like the main purpose of the proxy class is to work
around a basic deficiency in Python.  Now, if this is a purposeful
omissions (i.e. defined as part of the language), then a proxy class
makes sense.  If not, then you should probably work to fix the
implementation.

In general, the concept of a base proxy mixin makes sense if it's
generic enough and flexible enough to be of wider use to Python
programmers.  One measure of that might be to re-implement the existing
proxy-like classes to use this mixin class.  If that can't be done, then
this is too specialized (or too unproven) and should probably be a
cheeseshop module first.

I'm also uncomfortable with adding a new typetools module, mostly
because we have a types module.  I know they're there for different
purposes, but still, it seems ugly to me.  On top of that, adding a
module for a single class seems like overkill.

Do you have any ideas about where this might go in an existing module?

Overall, I'm -0 on adding this to Python.  Guido should have the final
say though.  I'm knocking this down to critical so it won't hold up the
betas.  Other than the refactoring, it seems like adding a proxy class,
while being a new feature, is isolated enough that it could go in after
beta.
msg67954 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 12:23
Unfortunately, the standard library doesn't tend to do this kind of
delegation (aside from weakref.proxy, which implements the equivalent
code directly in C), so there isn't a lot of standard library code that
will benefit from it directly.

However, maintaining such a class on PyPI is also fairly undesirable,
due to the extremely tight coupling between the list of methods it needs
to delegate explicitly and the tp_* slots in the PyType method
definitions - add a new tp_* slot, and it's necessary to add the same
methods to the proxy class. Different Python implementations are going
to have different needs as to which slots they have to delegate
explicitly, and which can be left to the __getattribute__ catch-all
delegation.

As far as adding a module for a single class goes, I wouldn't expect it
to remain that way forever. E.g., I'd hope to eventually see a
CallableMixin that defined __get__ the same way a function does, making
it easier to create callables that behave like functions when stored as
a class attribute.

That said, I'd be happy enough with adding the ProxyMixin to the types
module instead, but I thought we were still trying to get rid of that
module.
msg67961 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 12:51
And (mainly for Barry's benefit) a quick recap of why I think this is
necessary for Python 3.0:

For performance or correctness reasons, the interpreter is permitted to
bypass the normal __getattribute__ when looking up special methods such
as __print__ or __index__. Whether or not the normal attribute lookup
machinery is bypassed for a specific special method is an application
independent decision.

In CPython's case, this bypassing can occur either because there is a
tp_* slot dedicated to the method, or because the interpreter uses
Py_TYPE(obj) and _PyType_Lookup instead of PyObject_GetAttr to find the
method implementation (or type(obj).meth instead of obj.meth for special
method lookups implemented in Python code).

This behaviour creates a problem for value-based delegation such as that
provided by weakref.proxy: unlike overriding __getattr__ on a classic
class, merely overriding __getattribute__ on a new-style class instance
is insufficient to be able to correctly delegate all of the special methods.
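A small illustration of that lookup rule (Python 2.x, new-style class; the
names are made up for the example):

class Example(object):
    def __len__(self):
        return 10

e = Example()
e.__len__ = lambda: 99      # instance attribute, ignored by the implicit lookup

print e.__len__()           # 99 - explicit attribute access sees the instance
print len(e)                # 10 - the implicit lookup effectively does...
print type(e).__len__(e)    # 10 - ...this instead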

The intent of providing a typetools.ProxyMixin (or alternatively a
types.ProxyMixin class) is to allow fairly simple conversion of classic
classes that implement value-based delegation to new-style classes by
inheriting from ProxyMixin rather than inheriting from object directly.

Given the close proximity of the beta perhaps I should PEP'ify this to
get a formal yea or nay from Guido? I haven't managed to get much
response to previous python-dev posts about it.
msg67962 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 12:57
bleh, "application independent decision" in my last post should read
"interpreter implementation dependent decision".
msg67963 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 13:01
and "__print__" was meant to be "__unicode__"...
msg67976 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 14:20
New patch (proxymixin.diff) uploaded that correctly delegates
__format__, as well as using an overridable return_inplace() method to
generate the inplace operation return values. The _target attribute has
also been made formally part of the public API (as 'target'), although
you obviously need to explicitly invoke object.__getattribute__ in order
for it to be visible. The name of the attribute is also available at the
module level as _PROXY_TARGET_ATTR.
msg67977 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-06-11 14:21
Note that I don't make any promises about the correctness of the ReST
formatting in that latest patch - my Doc build is misbehaving at the
moment, and I haven't had a chance to look at what is wrong with it.
msg67987 - (view) Author: Raymond Hettinger (rhettinger) * (Python committer) Date: 2008-06-11 15:37
The name Proxy seems too vague.  This class is all about targeted 
delegation.  Am curious, has this been out as a recipe; has it been 
used in combat yet?
msg67996 - (view) Author: Guido van Rossum (gvanrossum) * (Python committer) Date: 2008-06-11 16:59
I want to make this "bypass getattr" behavior mandatory for those
operations that currently use it, forcing the issue for other
implementations of Python.  That's a doc change (but an important one!).
 There are probably many random places where the docs imply that getattr
is used where it isn't.

I am not sure that we need a proxy implementation in the stdlib; usually
when proxying there is some intentional irregularity (that's why you're
proxying) and I'm not sure how useful the mix-in class will be in
practice.  We should wait and see how effective it is in some realistic
situations before accepting it into the stdlib.  Also, typetools strikes
me as a horrible name.
msg68001 - (view) Author: Barry A. Warsaw (barry) * (Python committer) Date: 2008-06-11 17:31
Thanks for the pronouncement Guido.  We will not let this issue hold up
the beta releases.
msg69386 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-07-07 13:03
The outcome of discussion of this issue on python-dev was that the
lookup methodology for the special methods needs to be better
documented, especially for those cases where the instance *must* be
bypassed in order to avoid metaclass confusion for special methods that
apply to both types and instances (see issue 2517 for such a problem
that currently affects the lookup of __unicode__). However, we're not
prepared to add a standard delegation mixin to the standard library at
this stage (I may still add a cleaned up version of mine to the SVN
sandbox as an executable reference source for the relevant section of
the documentation though).

While I offered to write that new section of the docs during the
python-dev discussion, I'm not sure when I'll be able to get to it (My
Python time lately has mostly been spent investigating __hash__ fun and
games).
msg69675 - (view) Author: kundeng (kundeng06) Date: 2008-07-15 09:26
Hi, I was trying to implement a generic proxy class with some
restricting behaviors and then I bumped into this complication.  May I
ask what the conclusion is going to be in the long run? 

I guess my question is that 

1> is it just going to be a documentation change (first)?

2> is it technically difficult (other things are going to be broken
because of this change) or very inefficient to make the behavior of
special method lookup the same as for normal methods?

3> Otherwise, what is the rationale behind keeping the differences? 

thank you.
msg69676 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-07-15 09:52
There are both speed and correctness reasons for special methods being
looked up the way they are, so the behaviour isn't going to change.

Aside from providing additional details in the language reference on how
special methods are looked up (and the reasons things are done that
way), the only change that might eventually happen in this area is the
inclusion of a mixin class in the standard library to assist in writing
classes which delegate most operations to another object. That part
won't happen before 2.7/3.1 though (if it happens at all).
msg70694 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-08-04 12:13
Attaching a documentation patch for the moment until I get some info
back from Georg as to why I can't build the docs locally.

Once I get my local doc build working again, I'll check the formatting
and check it in.
msg70696 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-08-04 12:42
Committed for 2.6 as r65487.

I also blocked the automatic merge to 3.0 since the references to
old-style classes don't make sense there.
msg70699 - (view) Author: Georg Brandl (georg.brandl) * (Python committer) Date: 2008-08-04 12:55
But don't the docs with the patch describe the behavior of new-style classes
better?
msg70700 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-08-04 13:29
I meant to say that I will be merging it manually to avoid bringing the
old-style class specific parts over (that's why I left the issue open
and assigned to me).
msg70701 - (view) Author: Georg Brandl (georg.brandl) * (Python committer) Date: 2008-08-04 14:22
Ah, I'm sorry for the noise then.
msg72206 - (view) Author: Nick Coghlan (ncoghlan) * (Python committer) Date: 2008-08-31 12:46
Docs updated for 3.0 in r66084 (and I was right in thinking the
automatic merge didn't have a hope of getting this right - there were
conflicts all the way through the file).

Closing this issue after a mere 5 years and 9 months - any requests for
better proxying/special method delegation support in the standard
library should be raised in a new tracker issue as a feature request for
2.7/3.1.
History
Date User Action Args
2022-04-10 16:05:56  admin  set  github: 37537
2008-08-31 12:46:05  ncoghlan  set  status: open -> closed
    resolution: fixed
    messages: + msg72206
2008-08-04 14:22:59  georg.brandl  set  messages: + msg70701
2008-08-04 13:29:00  ncoghlan  set  messages: + msg70700
2008-08-04 12:55:25  georg.brandl  set  messages: + msg70699
2008-08-04 12:42:36  ncoghlan  set  messages: + msg70696
2008-08-04 12:14:02  ncoghlan  set  components: + Documentation, - Library (Lib)
2008-08-04 12:13:16  ncoghlan  set  files: + special_method_lookup_docs.diff
    assignee: ncoghlan
    messages: + msg70694
2008-07-15 09:52:17  ncoghlan  set  messages: + msg69676
2008-07-15 09:26:44  kundeng06  set  nosy: + kundeng06
    messages: + msg69675
2008-07-07 13:03:01  ncoghlan  set  assignee: barry -> (no value)
    messages: + msg69386
2008-06-11 17:31:59  barry  set  messages: + msg68001
2008-06-11 16:59:48  gvanrossum  set  nosy: + gvanrossum
    messages: + msg67996
2008-06-11 15:38:04  rhettinger  set  nosy: + rhettinger
    messages: + msg67987
2008-06-11 14:21:35  ncoghlan  set  messages: + msg67977
2008-06-11 14:20:24  ncoghlan  set  files: + proxymixin.diff
    messages: + msg67976
2008-06-11 13:01:01  ncoghlan  set  messages: + msg67963
2008-06-11 12:57:29  ncoghlan  set  messages: + msg67962
2008-06-11 12:51:50  ncoghlan  set  messages: + msg67961
2008-06-11 12:23:23  ncoghlan  set  messages: + msg67954
2008-06-11 11:30:25  barry  set  priority: release blocker -> critical
2008-06-11 11:29:54  barry  set  messages: + msg67948
2008-06-06 01:58:52  ncoghlan  set  messages: + msg67744
2008-06-05 17:33:12  Rhamphoryncus  set  messages: + msg67719
2008-06-05 14:17:59  ncoghlan  set  messages: + msg67713
2008-06-05 14:16:38  ncoghlan  set  files: + issue643841_typetools.diff
    messages: + msg67712
2008-05-25 13:07:52  ncoghlan  set  files: - typetools.rst
2008-05-25 13:07:46  ncoghlan  set  files: - test_typetools.py
2008-05-25 13:07:40  ncoghlan  set  files: - typetools.py
2008-05-25 13:05:51  ncoghlan  set  files: + test_typetools.py
2008-05-25 13:05:25  ncoghlan  set  files: + typetools.py
2008-05-25 13:04:52  ncoghlan  set  keywords: + patch
    files: + typetools.rst
    messages: + msg67346
    versions: + Python 3.0, - Python 2.5, Python 2.4, Python 2.3, Python 2.2.3, Python 2.2.2, Python 2.2.1
2008-05-24 00:26:11  ncoghlan  set  messages: + msg67276
2008-05-23 17:51:12  Rhamphoryncus  set  messages: + msg67257
2008-05-23 08:49:50  ncoghlan  set  messages: + msg67224
2008-05-23 08:34:10  ncoghlan  set  messages: + msg67223
2008-05-22 23:14:27  jcea  set  nosy: + jcea
2008-05-22 22:48:25  Rhamphoryncus  set  messages: + msg67209
2008-05-22 22:16:58  ncoghlan  set  messages: + msg67207
2008-05-22 17:09:05  Rhamphoryncus  set  messages: + msg67201
2008-05-22 13:48:32  ncoghlan  set  messages: + msg67191
    components: + Library (Lib), - Documentation
2008-05-22 13:44:56  ncoghlan  set  priority: normal -> release blocker
    assignee: georg.brandl -> barry
    messages: + msg67190
    files: + typetools.rst
    nosy: + barry, - jcea
2008-05-22 13:36:58  jcea  set  nosy: + jcea
2008-05-21 14:15:46  ncoghlan  set  files: + test_typetools.py
2008-05-21 14:14:56  ncoghlan  set  files: - typetools.py
2008-05-21 14:14:45  ncoghlan  set  files: + typetools.py
    messages: + msg67155
2008-05-20 18:49:02  Rhamphoryncus  set  nosy: + Rhamphoryncus
    messages: + msg67136
2008-05-20 14:16:00  ncoghlan  set  files: + typetools.py
    messages: + msg67127
2008-04-22 18:25:29  jkrukoff  set  messages: + msg65677
2008-04-17 05:44:20  PiDelport  set  nosy: + PiDelport
    messages: + msg65575
2008-04-08 16:10:13  ncoghlan  set  messages: + msg65179
2008-04-02 13:32:22  ncoghlan  set  messages: + msg64854
2008-04-02 13:17:17  ncoghlan  set  messages: + msg64853
2008-04-01 15:50:27  jkrukoff  set  messages: + msg64813
2008-03-29 03:47:28  ncoghlan  set  messages: + msg64678
2008-03-28 17:54:50  jkrukoff  set  assignee: georg.brandl
    messages: + msg64641
    nosy: + georg.brandl, jkrukoff
    components: + Documentation, - None
    versions: + Python 2.6, Python 2.5, Python 2.4, Python 2.3, Python 2.2.3, Python 2.2.2, Python 2.2.1
2002-11-25 23:37:25  terry.reedy  create