I wrote code like this
>>> class a(object):
...     def __init__(self):
...         self.__call__ = lambda x: x
...
>>> b = a()
I expected b() to return b, but instead it raises TypeError: 'a' object is not callable. Why doesn't assigning __call__ on the instance work?
What about this? Define a base class AllowDynamicCall:
class AllowDynamicCall(object):
    def __call__(self, *args, **kwargs):
        return self._callfunc(self, *args, **kwargs)
And then subclass AllowDynamicCall:
class Example(AllowDynamicCall):
    def __init__(self):
        self._callfunc = lambda s: s
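A quick sanity check of the pattern above: `__call__` lives on the base class, so the call machinery finds it, and it delegates to the per-instance `_callfunc` attribute (note that `self` is passed explicitly, because a plain function stored on an instance is not a bound method):

```python
class AllowDynamicCall(object):
    def __call__(self, *args, **kwargs):
        # Delegate to a per-instance attribute; self is passed explicitly
        # because _callfunc is a plain function, not a bound method.
        return self._callfunc(self, *args, **kwargs)

class Example(AllowDynamicCall):
    def __init__(self):
        self._callfunc = lambda s: s

e = Example()
print(e() is e)  # True: the identity lambda hands back the instance itself
```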
Special methods are looked up on the type (i.e., the class) of the object being operated on, not on the specific instance. Think about it: otherwise, if a class defines __call__, then when the class itself is called that __call__ should get called... what a disaster! But fortunately the special method is instead looked up on the class's type, AKA metaclass, and all is well ("legacy classes" had very irregular behavior in this, which is why we're all better off with new-style classes -- the only kind left in Python 3).
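The lookup rule is easy to verify directly: an attribute named __call__ set on an instance is ignored by the call machinery, while the same attribute set on the class takes effect immediately (class and variable names here are just for illustration):

```python
class A(object):
    pass

obj = A()
obj.__call__ = lambda: 42   # instance attribute: ignored by special-method lookup
try:
    obj()
except TypeError as exc:
    print("not callable:", exc)

A.__call__ = lambda self: 42  # class attribute: found by special-method lookup
print(obj())  # 42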
So if you need "per-instance overriding" of special methods, you have to ensure the instance has its own unique class. That's very easy:
class a(object):
    def __init__(self):
        self.__class__ = type(self.__class__.__name__, (self.__class__,), {})
        self.__class__.__call__ = lambda x: x
and you're there. Of course that would be silly in this case, as every instance ends up with the same "so-called per-instance" (!) __call__, but it would be useful if you really needed overriding on a per-individual-instance basis.
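To see that the trick really gives per-instance behaviour, here is a small variation on the snippet above in which each instance receives a different function (the class and variable names are hypothetical, chosen for the demo):

```python
class PerInstance(object):
    def __init__(self, func):
        # Same trick as above: give the instance its own unique subclass,
        # then attach the special method to that one-off class.
        self.__class__ = type(self.__class__.__name__, (self.__class__,), {})
        self.__class__.__call__ = func

double = PerInstance(lambda self, x: x * 2)
triple = PerInstance(lambda self, x: x * 3)
print(double(10), triple(10))  # 20 30
```

Each instance gets its own throwaway subclass, so setting __call__ on one instance's class never affects the others.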
__call__ needs to be defined on the class, not on the instance:
class a(object):
    def __init__(self):
        pass
    __call__ = lambda x: x
but most people probably find it more readable to define the method the usual way:
class a(object):
    def __init__(self):
        pass
    def __call__(self):
        return self
If you need different behaviour for each instance, you could do it like this:
class a(object):
    def __init__(self):
        self.myfunc = lambda x: x
    def __call__(self):
        return self.myfunc(self)
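With this delegation pattern, per-instance behaviour comes from swapping the ordinary attribute myfunc, while __call__ itself stays on the class where the lookup machinery can find it (variable names below are just for the demo):

```python
class a(object):
    def __init__(self):
        self.myfunc = lambda x: x
    def __call__(self):
        return self.myfunc(self)

first = a()
second = a()
second.myfunc = lambda x: "overridden"  # only this instance changes
print(first() is first)  # True: still the default identity function
print(second())          # overridden
```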