Question
I have a method that takes (among other arguments) a dictionary. The method parses strings, and the dictionary provides replacements for some substrings, so it doesn't have to be mutable.
This function is called quite often, and on redundant inputs, so I figured that caching it would improve its efficiency.
But, as you may have guessed, since dict is mutable and thus not hashable, @functools.lru_cache can't decorate my function. So how can I overcome this?
Bonus points if it needs only standard-library classes and methods. Ideally, if some kind of frozendict exists in the standard library that I haven't seen, it would make my day.
PS: namedtuple only as a last resort, since it would require a big syntax shift.
Answer 1:
Instead of writing a custom hashable dictionary, use this and avoid reinventing the wheel! It's an immutable, hashable dictionary:
https://pypi.org/project/frozendict/
Code:
import functools
from frozendict import frozendict  # third-party: pip install frozendict

def freezeargs(func):
    """Convert mutable dict arguments into immutable ones.

    Useful for making a function compatible with caching.
    """
    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        args = tuple(frozendict(arg) if isinstance(arg, dict) else arg for arg in args)
        kwargs = {k: frozendict(v) if isinstance(v, dict) else v for k, v in kwargs.items()}
        return func(*args, **kwargs)
    return wrapped
and then:
@freezeargs
@functools.lru_cache
def func(...):
    pass
Code taken from @fast_cen's answer.
Note: this does not work on recursive data structures; for example, you might have an argument that's a list, which is unhashable. You are invited to make the wrapping recursive, so that it goes deep into the data structure and makes every dict frozen and every list a tuple.
(I know that the OP no longer wants a solution, but I came here looking for the same one, so I'm leaving this for future generations.)
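A minimal sketch of the recursive variant the note invites, using a small hashable dict subclass (the same trick as the HDict answer below) so it stays standard-library only. The names deep_freeze, freeze_deep_args, and lookup are hypothetical, chosen for illustration:

```python
import functools

class HDict(dict):
    """Hashable dict subclass (same trick as the HDict answer below)."""
    def __hash__(self):
        return hash(frozenset(self.items()))

def deep_freeze(value):
    """Recursively wrap every dict in HDict and turn every list into a tuple."""
    if isinstance(value, dict):
        return HDict({k: deep_freeze(v) for k, v in value.items()})
    if isinstance(value, list):
        return tuple(deep_freeze(v) for v in value)
    return value

def freeze_deep_args(func):
    """Like freezeargs above, but recurses into nested structures."""
    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        return func(*(deep_freeze(a) for a in args),
                    **{k: deep_freeze(v) for k, v in kwargs.items()})
    return wrapped

@freeze_deep_args
@functools.lru_cache(maxsize=None)
def lookup(config):
    # config arrives as an HDict, which still supports normal dict reads
    return config["replacements"]["foo"]
```

After freezing, nested dicts and lists are all hashable, so lru_cache accepts arbitrarily nested arguments. Note the cached function receives HDict/tuple values rather than plain dict/list ones.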
Answer 2:
What about creating a hashable dict subclass, like so:
class HDict(dict):
    def __hash__(self):
        return hash(frozenset(self.items()))

substs = HDict({'foo': 'bar', 'baz': 'quz'})
cache = {substs: True}
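A quick usage sketch matching the OP's string-replacement use case (the class is repeated so the snippet runs standalone; replace_all is a hypothetical name for illustration):

```python
import functools

class HDict(dict):
    """Hashable dict from the answer above, repeated so this runs standalone."""
    def __hash__(self):
        return hash(frozenset(self.items()))

# Hypothetical replacement function: substs must be an HDict so lru_cache can hash it
@functools.lru_cache(maxsize=None)
def replace_all(text, substs):
    for old, new in substs.items():
        text = text.replace(old, new)
    return text

print(replace_all("foo and baz", HDict({"foo": "bar", "baz": "quz"})))  # bar and quz
```

Because equal HDicts hash equally, calling again with a fresh but equal HDict is served from the cache.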
Answer 3:
Here is a decorator that uses @mhyfritz's trick.
import functools

def hash_dict(func):
    """Convert mutable dict arguments into immutable ones.

    Useful for making a function compatible with caching.
    """
    class HDict(dict):
        def __hash__(self):
            return hash(frozenset(self.items()))

    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        args = tuple(HDict(arg) if isinstance(arg, dict) else arg for arg in args)
        kwargs = {k: HDict(v) if isinstance(v, dict) else v for k, v in kwargs.items()}
        return func(*args, **kwargs)
    return wrapped
Simply add it before your lru_cache.
@hash_dict
@functools.lru_cache()
def your_function():
...
Answer 4:
How about subclassing namedtuple and adding access by x["key"]?
from collections import namedtuple

class X(namedtuple("Y", "a b c")):
    def __getitem__(self, item):
        if isinstance(item, int):
            return super(X, self).__getitem__(item)
        return getattr(self, item)
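A quick check of how instances behave (class repeated so the example is self-contained). Since namedtuples are already hashable, such an instance can be passed straight to an lru_cache-decorated function:

```python
from collections import namedtuple

class X(namedtuple("Y", "a b c")):
    """Subclass from the answer above, repeated so this runs standalone."""
    def __getitem__(self, item):
        if isinstance(item, int):
            return super(X, self).__getitem__(item)
        return getattr(self, item)

x = X(a=1, b=2, c=3)
print(x[0])    # positional access still works -> 1
print(x["b"])  # string-key access like a dict -> 2
```

The trade-off, as the OP noted, is the syntax shift: the fields must be declared up front, unlike a free-form dict.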
Answer 5:
Here's a decorator that can be used like functools.lru_cache. But it is targeted at functions that take only one argument, which must be a flat mapping with hashable values, and it has a fixed maxsize of 64. For your use case you would have to adapt either this example or your client code. Also, to set the maxsize individually, one would have to implement another decorator, but I haven't wrapped my head around that since I didn't need it.
from functools import (_CacheInfo, _lru_cache_wrapper, lru_cache,
                       partial, update_wrapper)
from typing import Any, Callable, Dict, Hashable

def lru_dict_arg_cache(func: Callable) -> Callable:
    def unpacking_func(func: Callable, arg: frozenset) -> Any:
        return func(dict(arg))

    _unpacking_func = partial(unpacking_func, func)
    _cached_unpacking_func = \
        _lru_cache_wrapper(_unpacking_func, 64, False, _CacheInfo)

    def packing_func(arg: Dict[Hashable, Hashable]) -> Any:
        return _cached_unpacking_func(frozenset(arg.items()))

    update_wrapper(packing_func, func)
    packing_func.cache_info = _cached_unpacking_func.cache_info
    return packing_func
@lru_dict_arg_cache
def uppercase_keys(arg: dict) -> dict:
    """ Yelling keys. """
    return {k.upper(): v for k, v in arg.items()}

assert uppercase_keys.__name__ == 'uppercase_keys'
assert uppercase_keys.__doc__ == ' Yelling keys. '
assert uppercase_keys({'ham': 'spam'}) == {'HAM': 'spam'}
assert uppercase_keys({'ham': 'spam'}) == {'HAM': 'spam'}
cache_info = uppercase_keys.cache_info()
assert cache_info.hits == 1
assert cache_info.misses == 1
assert cache_info.maxsize == 64
assert cache_info.currsize == 1
assert uppercase_keys({'foo': 'bar'}) == {'FOO': 'bar'}
assert uppercase_keys({'foo': 'baz'}) == {'FOO': 'baz'}
cache_info = uppercase_keys.cache_info()
assert cache_info.hits == 1
assert cache_info.misses == 3
assert cache_info.currsize == 3
For a more generic approach, one could use the @cached decorator from the third-party cachetools library with an appropriate function passed as key.
Answer 6:
After deciding to drop the LRU cache for our use case for now, we still came up with a solution. This decorator uses json to serialise and deserialise the args/kwargs sent to the cache. It works with any number of arguments. Use it as a decorator on a function instead of @lru_cache. The max size is set to 1024.
import json
from functools import lru_cache, wraps

def hashable_lru(func):
    cache = lru_cache(maxsize=1024)

    def deserialise(value):
        try:
            return json.loads(value)
        except Exception:
            return value

    def func_with_serialized_params(*args, **kwargs):
        _args = tuple(deserialise(arg) for arg in args)
        _kwargs = {k: deserialise(v) for k, v in kwargs.items()}
        return func(*_args, **_kwargs)

    cached_function = cache(func_with_serialized_params)

    @wraps(func)
    def lru_decorator(*args, **kwargs):
        _args = tuple(json.dumps(arg, sort_keys=True) if type(arg) in (list, dict) else arg
                      for arg in args)
        _kwargs = {k: json.dumps(v, sort_keys=True) if type(v) in (list, dict) else v
                   for k, v in kwargs.items()}
        return cached_function(*_args, **_kwargs)

    lru_decorator.cache_info = cached_function.cache_info
    lru_decorator.cache_clear = cached_function.cache_clear
    return lru_decorator
Source: https://stackoverflow.com/questions/6358481/using-functools-lru-cache-with-dictionary-arguments