We're considering using Python (IronPython, but I don't think that's relevant) to provide a sort of 'macro' support for another application, which controls a piece of equipment.
I would use some kind of decorator:
```python
class TypeProtector(object):
    def __init__(self, fun, types):
        self.fun, self.types = fun, types

    def __call__(self, *args, **kwargs):
        # validate args with self.types
        pass
        # run function
        return self.fun(*args, **kwargs)

def types(*args):
    def decorator(fun):
        # validate args count with fun parameters count
        pass
        # return covered function
        return TypeProtector(fun, args)
    return decorator

@types(Time, Temperature)
def myfunction(foo, bar):
    pass

myfunction('21:21', '32C')
print myfunction.types
```
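For illustration, here is a minimal runnable sketch of that decorator with the validation filled in. The isinstance check and the placeholder Time/Temperature classes are assumptions for the example; the real classes would come from the host application:

```python
import functools

class TypeProtector(object):
    """Wraps a function and records the expected parameter types."""
    def __init__(self, fun, types):
        self.fun, self.types = fun, types
        functools.update_wrapper(self, fun)

    def __call__(self, *args, **kwargs):
        # validate positional args against the recorded types
        for value, expected in zip(args, self.types):
            if not isinstance(value, expected):
                raise TypeError('%r is not a %s' % (value, expected.__name__))
        return self.fun(*args, **kwargs)

def types(*expected):
    def decorator(fun):
        return TypeProtector(fun, expected)
    return decorator

# placeholder types for the sketch; str subclasses keep the demo simple
class Time(str): pass
class Temperature(str): pass

@types(Time, Temperature)
def myfunction(foo, bar):
    return foo, bar

myfunction(Time('21:21'), Temperature('32C'))  # passes validation
print(myfunction.types)
```

Passing plain strings instead of Time/Temperature instances would raise a TypeError from `__call__`.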
Decorators are a good way to add metadata to functions. Add one that stores the list of types on a .params attribute or similar:
```python
def takes(*args):
    def _takes(fcn):
        fcn.params = args
        return fcn
    return _takes

@takes("time", "temp", "time")
def do_stuff(start_time, average_temp, stop_time):
    pass
```
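A tool such as a macro editor could then pair each parameter name with its declared kind by combining the standard inspect module with that attribute. A small sketch (Python 3):

```python
import inspect

def takes(*args):
    def _takes(fcn):
        fcn.params = args
        return fcn
    return _takes

@takes("time", "temp", "time")
def do_stuff(start_time, average_temp, stop_time):
    pass

# pair each parameter name with its declared kind
names = inspect.getfullargspec(do_stuff).args
schema = dict(zip(names, do_stuff.params))
print(schema)
# -> {'start_time': 'time', 'average_temp': 'temp', 'stop_time': 'time'}
```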
For Python 2.x, I like to use the docstring:

```python
def my_func(txt):
    """{
        "name": "Justin",
        "age": 15
    }"""
    pass
```
and the metadata can be automatically assigned to the function object with this snippet:
```python
import json

# snapshot globals() first: the loop variables would otherwise
# mutate the dict while it is being iterated
for f in list(globals()):
    if not hasattr(globals()[f], '__call__'):
        continue
    try:
        meta = json.loads(globals()[f].__doc__)
    except (TypeError, ValueError):
        # no docstring, or docstring is not valid JSON
        continue
    for k, v in meta.items():
        setattr(globals()[f], k, v)
```
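Putting both pieces together, a self-contained version of the idea might look like this (Python 3 syntax, errors narrowed, and globals() copied to a list so the dict is not mutated mid-iteration):

```python
import json

def my_func(txt):
    """{
        "name": "Justin",
        "age": 15
    }"""
    pass

# promote JSON docstrings on module-level callables to attributes
for f in list(globals()):
    obj = globals()[f]
    if not callable(obj):
        continue
    try:
        meta = json.loads(obj.__doc__)
    except (TypeError, ValueError):
        continue  # no docstring, or not valid JSON
    if not isinstance(meta, dict):
        continue  # docstring parsed, but not to a JSON object
    for k, v in meta.items():
        setattr(obj, k, v)

print(my_func.name, my_func.age)
```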
The 'pythonic' way to do this (in Python 3) is function annotations:
```python
def DoSomething(critical_temp: "temperature", time: "time"):
    pass
```
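The annotations end up in the function's __annotations__ mapping, so tooling can read them back without any custom decorator:

```python
def DoSomething(critical_temp: "temperature", time: "time"):
    pass

# each parameter name maps to its annotation
print(DoSomething.__annotations__)
# -> {'critical_temp': 'temperature', 'time': 'time'}
```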