I'd like to find out the arity of a method in Python (the number of parameters that it receives). Right now I'm doing this:
def arity(obj, method):
    return getattr(obj.__class__, method).func_code.co_argcount - 1  # remove self
Ideally, you'd want to monkey-patch the arity function as a method on Python function objects. Here's how:

def arity(self, method):
    return getattr(self.__class__, method).func_code.co_argcount - 1

functor = arity.__class__
functor.arity = arity
# or, equivalently, in one line:
arity.__class__.arity = arity

But since CPython implements the function type in C, you can't actually modify it. This may work in PyPy, though.
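For what it's worth, here is a minimal sketch (CPython 2.x assumed) of the rejection you run into when you try to patch the function type directly:

import types

def arity(self):
    # would-be method: count the function's positional parameters
    return self.func_code.co_argcount

try:
    # the function type is a C-implemented (built-in) type, so this fails
    types.FunctionType.arity = arity
except TypeError as e:
    print e  # can't set attributes of built-in/extension type 'function'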
That's all assuming your arity() function is correct. What about variadic functions? Do you even want an answer then?
Module inspect from Python's standard library is your friend -- see the online docs! inspect.getargspec(func) returns a tuple with four items, (args, varargs, varkw, defaults): len(args) is the "primary arity", but the arity can be anything from that to infinity if varargs and/or varkw are not None, and some arguments may be omitted (and defaulted) if defaults is not None. How you turn that into a single number beats me, but presumably you have your ideas on the matter!-)
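For illustration, here is one rough way to collapse that tuple into a (minimum, maximum) pair; arity_range is just an illustrative name, and using None for an unbounded maximum is my own convention, not part of inspect:

import inspect

def arity_range(func):
    # minimum = required positional args; maximum = None when *args
    # makes the upper bound unbounded
    args, varargs, varkw, defaults = inspect.getargspec(func)
    maximum = None if varargs else len(args)
    minimum = len(args) - (len(defaults) if defaults else 0)
    return minimum, maximum

def f(a, b, c=1, *rest):
    pass

print arity_range(f)  # (2, None)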
This applies to Python-coded functions, but not to C-coded ones. Nothing in the Python C API lets C-coded functions (including built-ins) expose their signature for introspection, except via their docstring (or, optionally, via annotations in Python 3); so you will need to fall back to docstring parsing as a last resort if other approaches fail (of course, the docstring might be missing too, in which case the function will remain a mystery).
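As a very rough illustration of that last-resort docstring parsing (the regular expression and the doc_arity name are my own, and the whole thing is only a heuristic), something like this works for many CPython 2 built-ins whose docstrings start with a "name(arg, ...)" line:

import re

def doc_arity(func):
    # heuristic: count the arguments listed in a "name(a, b[, c])"-style
    # first docstring line; returns None when no signature is found
    first_line = (func.__doc__ or '').split('\n')[0]
    match = re.match(r'\w+\((.*)\)', first_line)
    if match is None:
        return None
    args = match.group(1).replace('[', '').replace(']', '')
    return len([a for a in args.split(',') if a.strip()])

print doc_arity(len)  # 1 on CPython 2, where the docstring starts with "len(object) -> integer"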
Use a decorator to decorate methods, e.g.:
def arity(method):
    def _arity():
        return method.func_code.co_argcount - 1  # remove self
    method.arity = _arity
    return method

class Foo:
    @arity
    def bar(self, bla):
        pass

print Foo().bar.arity()
Now implement the _arity function to calculate the argument count based on your needs.
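For example, if you want _arity to honour default values as well, one option (a sketch only, leaning on inspect.getargspec as discussed above) is:

import inspect

def arity(method):
    def _arity():
        # minimum call arity: positional args minus self minus defaulted args
        args, varargs, varkw, defaults = inspect.getargspec(method)
        return len(args) - 1 - (len(defaults) if defaults else 0)
    method.arity = _arity
    return method

class Foo:
    @arity
    def bar(self, bla, blub=None):
        pass

print Foo().bar.arity()  # 1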
Here is another attempt using a metaclass; I use Python 2.5, but with 2.6 you could just as easily decorate the class (a sketch of that variant follows the example below). The metaclass can also be defined at module level, so it applies to all classes in the module.
from types import FunctionType

def arity(unboundmethod):
    def _arity():
        return unboundmethod.func_code.co_argcount - 1  # remove self
    unboundmethod.arity = _arity
    return unboundmethod

class ArityMetaclass(type):
    def __new__(meta, name, bases, attrs):
        newAttrs = {}
        for attributeName, attribute in attrs.items():
            if type(attribute) == FunctionType:
                attribute = arity(attribute)
            newAttrs[attributeName] = attribute
        klass = type.__new__(meta, name, bases, newAttrs)
        return klass

class Foo:
    __metaclass__ = ArityMetaclass
    def bar(self, bla):
        pass

print Foo().bar.arity()
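For Python 2.6+, the same effect can be had with a class decorator instead of the metaclass; this sketch reuses the arity() helper defined above (add_arity is just an illustrative name):

from types import FunctionType

def add_arity(cls):
    # attach .arity() to every plain function defined on the class
    for name, attribute in vars(cls).items():
        if type(attribute) == FunctionType:
            arity(attribute)
    return cls

@add_arity
class Foo(object):
    def bar(self, bla):
        pass

print Foo().bar.arity()  # 1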
This is the only way that I can think of that should be 100% effective (at least with regard to whether the function is user-defined or written in C) at determining a function's (minimum) arity. However, you should be sure that the function being probed has no side effects and that it won't itself throw a TypeError:
from functools import partial

def arity(func):
    pfunc = func
    i = 0
    while True:
        try:
            pfunc()
        except TypeError:
            pfunc = partial(pfunc, '')
            i += 1
        else:
            return i

def foo(x, y, z):
    pass

def varfoo(*args):
    pass

class klass(object):
    def klassfoo(self):
        pass
print arity(foo)
print arity(varfoo)
x = klass()
print arity(x.klassfoo)
# output
# 3
# 0
# 0
As you can see, this will determine the minimum arity if a function takes a variable number of arguments. It also won't take into account the self or cls argument of a class or instance method.
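The same goes for defaulted parameters: the probing stops at the first call that succeeds, so only the required arguments are counted (deffoo is just an illustrative name):

def deffoo(x, y=1, z=2):
    pass

print arity(deffoo)  # 1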
To be totally honest, though, I wouldn't use this function in a production environment unless I knew exactly which functions would be called, as there is a lot of room for stupid errors. That may defeat the purpose.