Python dataclass from a nested dict


Question


The standard library in 3.7 can recursively convert a dataclass into a dict (example from the docs):

from dataclasses import dataclass, asdict
from typing import List

@dataclass
class Point:
    x: int
    y: int

@dataclass
class C:
    mylist: List[Point]

p = Point(10, 20)
assert asdict(p) == {'x': 10, 'y': 20}

c = C([Point(0, 0), Point(10, 4)])
tmp = {'mylist': [{'x': 0, 'y': 0}, {'x': 10, 'y': 4}]}
assert asdict(c) == tmp

I am looking for a way to turn a dict back into a dataclass when there is nesting. Something like C(**tmp) only works if the fields of the dataclass are simple types and not themselves dataclasses. I am familiar with jsonpickle, but it comes with a prominent security warning.
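For example, the naive C(**tmp) leaves the nested values as plain dicts rather than Point instances, so the round trip does not reproduce the original object:

c2 = C(**tmp)
print(c2.mylist[0])   # {'x': 0, 'y': 0} -- a plain dict, not a Point
assert c2 != c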


Answer 1:


Below is the CPython implementation of asdict – or specifically, the internal recursive helper function _asdict_inner that it uses:

# Source: https://github.com/python/cpython/blob/master/Lib/dataclasses.py

def _asdict_inner(obj, dict_factory):
    if _is_dataclass_instance(obj):
        result = []
        for f in fields(obj):
            value = _asdict_inner(getattr(obj, f.name), dict_factory)
            result.append((f.name, value))
        return dict_factory(result)
    elif isinstance(obj, tuple) and hasattr(obj, '_fields'):
        # [large block of author comments]
        return type(obj)(*[_asdict_inner(v, dict_factory) for v in obj])
    elif isinstance(obj, (list, tuple)):
        # [ditto]
        return type(obj)(_asdict_inner(v, dict_factory) for v in obj)
    elif isinstance(obj, dict):
        return type(obj)((_asdict_inner(k, dict_factory),
                          _asdict_inner(v, dict_factory))
                         for k, v in obj.items())
    else:
        return copy.deepcopy(obj)

asdict simply calls the above after checking that its argument is a dataclass instance, with dict_factory=dict by default.
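For reference, the public asdict wrapper is roughly the following (docstring omitted):

def asdict(obj, *, dict_factory=dict):
    if not _is_dataclass_instance(obj):
        raise TypeError("asdict() should be called on dataclass instances")
    return _asdict_inner(obj, dict_factory)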

How can this be adapted to create an output dictionary that carries the required type-tagging?


1. Adding type information

My attempt involved creating a custom return wrapper inheriting from dict:

class TypeDict(dict):
    def __init__(self, t, *args, **kwargs):
        super(TypeDict, self).__init__(*args, **kwargs)

        if not isinstance(t, type):
            raise TypeError("t must be a type")

        self._type = t

    @property
    def type(self):
        return self._type

Looking at the original code, only the first clause needs to be modified to use this wrapper, as the other clauses only handle containers of dataclasses:

# only use dict for now; easy to add back later
def _todict_inner(obj):
    if is_dataclass_instance(obj):
        result = []
        for f in fields(obj):
            value = _todict_inner(getattr(obj, f.name))
            result.append((f.name, value))
        return TypeDict(type(obj), result)

    elif isinstance(obj, tuple) and hasattr(obj, '_fields'):
        return type(obj)(*[_todict_inner(v) for v in obj])
    elif isinstance(obj, (list, tuple)):
        return type(obj)(_todict_inner(v) for v in obj)
    elif isinstance(obj, dict):
        return type(obj)((_todict_inner(k), _todict_inner(v))
                         for k, v in obj.items())
    else:
        return copy.deepcopy(obj)

Imports:

from dataclasses import dataclass, fields, is_dataclass

# thanks to Patrick Haugh
from typing import *

# deepcopy 
import copy

Functions used:

# copy of the internal function _is_dataclass_instance
def is_dataclass_instance(obj):
    return is_dataclass(obj) and not isinstance(obj, type)

# the adapted version of asdict
def todict(obj):
    if not is_dataclass_instance(obj):
        raise TypeError("todict() should be called on dataclass instances")
    return _todict_inner(obj)

Tests with the example dataclasses:

c = C([Point(0, 0), Point(10, 4)])

print(c)
cd = todict(c)

print(cd)
# {'mylist': [{'x': 0, 'y': 0}, {'x': 10, 'y': 4}]}

print(cd.type)
# <class '__main__.C'>

Results are as expected.


2. Converting back to a dataclass

The recursive routine used by asdict can be re-used for the reverse process, with some relatively minor changes:

def _fromdict_inner(obj):
    # reconstruct the dataclass using the type tag
    if is_dataclass_dict(obj):
        result = {}
        for name, data in obj.items():
            result[name] = _fromdict_inner(data)
        return obj.type(**result)

    # exactly the same as before (without the tuple clause)
    elif isinstance(obj, (list, tuple)):
        return type(obj)(_fromdict_inner(v) for v in obj)
    elif isinstance(obj, dict):
        return type(obj)((_fromdict_inner(k), _fromdict_inner(v))
                         for k, v in obj.items())
    else:
        return copy.deepcopy(obj)

Functions used:

def is_dataclass_dict(obj):
    return isinstance(obj, TypeDict)

def fromdict(obj):
    if not is_dataclass_dict(obj):
        raise TypeError("fromdict() should be called on TypeDict instances")
    return _fromdict_inner(obj)

Test:

c = C([Point(0, 0), Point(10, 4)])
cd = todict(c)
cf = fromdict(cd)

print(c)
# C(mylist=[Point(x=0, y=0), Point(x=10, y=4)])

print(cf)
# C(mylist=[Point(x=0, y=0), Point(x=10, y=4)])

Again as expected.
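A compact round-trip check (dataclasses generate __eq__ by default, so instance equality works here):

assert fromdict(todict(c)) == c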




Answer 2:


I'm the author of dacite - a tool that simplifies the creation of data classes from dictionaries.

The library has only one function, from_dict - here is a quick example of usage:

from dataclasses import dataclass
from dacite import from_dict

@dataclass
class User:
    name: str
    age: int
    is_active: bool

data = {
    'name': 'john',
    'age': 30,
    'is_active': True,
}

user = from_dict(data_class=User, data=data)

assert user == User(name='john', age=30, is_active=True)

Moreover, dacite supports the following features:

  • nested structures (see the sketch at the end of this answer)
  • (basic) types checking
  • optional fields (i.e. typing.Optional)
  • unions
  • collections
  • values casting and transformation
  • remapping of field names

... and it's well tested - 100% code coverage!

To install dacite, simply use pip (or pipenv):

$ pip install dacite
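For the nested example from the question, the same from_dict call handles the nesting; a short sketch (assuming dacite is installed):

from dataclasses import dataclass
from typing import List

from dacite import from_dict

@dataclass
class Point:
    x: int
    y: int

@dataclass
class C:
    mylist: List[Point]

tmp = {'mylist': [{'x': 0, 'y': 0}, {'x': 10, 'y': 4}]}
c = from_dict(data_class=C, data=tmp)
assert c == C([Point(0, 0), Point(10, 4)])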



Answer 3:


You can use mashumaro to create a dataclass object from a dict according to the schema. The Mixin from this library adds convenient from_dict and to_dict methods to dataclasses:

from dataclasses import dataclass
from typing import List
from mashumaro import DataClassDictMixin

@dataclass
class Point(DataClassDictMixin):
    x: int
    y: int

@dataclass
class C(DataClassDictMixin):
    mylist: List[Point]

p = Point(10, 20)
tmp = {'x': 10, 'y': 20}
assert p.to_dict() == tmp
assert Point.from_dict(tmp) == p

c = C([Point(0, 0), Point(10, 4)])
tmp = {'mylist': [{'x': 0, 'y': 0}, {'x': 10, 'y': 4}]}
assert c.to_dict() == tmp
assert C.from_dict(tmp) == c



Answer 4:


All it takes is a five-liner:

import dataclasses

def dataclass_from_dict(klass, d):
    try:
        fieldtypes = {f.name: f.type for f in dataclasses.fields(klass)}
        return klass(**{f: dataclass_from_dict(fieldtypes[f], d[f]) for f in d})
    except:
        return d  # Not a dataclass field

Sample usage:

from dataclasses import dataclass, asdict

@dataclass
class Point:
    x: float
    y: float

@dataclass
class Line:
    a: Point
    b: Point

line = Line(Point(1,2), Point(3,4))
assert line == dataclass_from_dict(Line, asdict(line))
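Note that this minimal version does not recurse into containers, so a field like List[Point] from the question ends up as a list of plain dicts. A rough sketch of one possible extension (the dataclass_from_dict_nested name and the reliance on the typing __args__ attribute are illustrative choices, not part of the original answer):

import dataclasses
from dataclasses import asdict, dataclass
from typing import List

def dataclass_from_dict_nested(klass, d):
    # Recurse into lists so that fields like List[Point] round-trip too.
    if isinstance(d, list):
        # A parameterized annotation such as List[Point] keeps the item type
        # in __args__; fall back to None for an unparameterized list.
        item_type = getattr(klass, '__args__', (None,))[0]
        return [dataclass_from_dict_nested(item_type, item) for item in d]
    try:
        fieldtypes = {f.name: f.type for f in dataclasses.fields(klass)}
        return klass(**{f: dataclass_from_dict_nested(fieldtypes[f], d[f]) for f in d})
    except TypeError:
        return d  # not a dataclass; leave the value as-is

@dataclass
class Point:
    x: float
    y: float

@dataclass
class C:
    mylist: List[Point]

c = C([Point(0, 0), Point(10, 4)])
assert dataclass_from_dict_nested(C, asdict(c)) == c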

Full code, including conversion to/from JSON, is in this gist: https://gist.github.com/gatopeich/1efd3e1e4269e1e98fae9983bb914f22




Answer 5:


If your goal is to produce JSON from, and back into, existing, predefined dataclasses, then just write custom encoder and decoder hooks. Do not use dataclasses.asdict() here; instead, record in the JSON a (safe) reference to the original dataclass.

jsonpickle is not safe because it stores references to arbitrary Python objects and passes data to their constructors. With such references I can get jsonpickle to reference internal Python data structures and create and execute functions, classes and modules at will. But that doesn't mean you can't handle such references safely: just verify that you only import (not call) the referenced object, and then verify that it is an actual dataclass type, before you use it.

The approach can be made fairly generic while still being limited to JSON-serialisable types plus dataclass-based instances:

import dataclasses
import importlib
import sys

def dataclass_object_dump(ob):
    datacls = type(ob)
    if not dataclasses.is_dataclass(datacls):
        raise TypeError(f"Expected dataclass instance, got '{datacls!r}' object")
    mod = sys.modules.get(datacls.__module__)
    if mod is None or not hasattr(mod, datacls.__qualname__):
        raise ValueError(f"Can't resolve '{datacls!r}' reference")
    ref = f"{datacls.__module__}.{datacls.__qualname__}"
    fields = (f.name for f in dataclasses.fields(ob))
    return {**{f: getattr(ob, f) for f in fields}, '__dataclass__': ref}

def dataclass_object_load(d):
    ref = d.pop('__dataclass__', None)
    if ref is None:
        return d
    try:
        modname, hasdot, qualname = ref.rpartition('.')
        module = importlib.import_module(modname)
        datacls = getattr(module, qualname)
        if not dataclasses.is_dataclass(datacls) or not isinstance(datacls, type):
            raise ValueError
        return datacls(**d)
    except (ModuleNotFoundError, ValueError, AttributeError, TypeError):
        raise ValueError(f"Invalid dataclass reference {ref!r}") from None

This uses JSON-RPC-style class hints to name the dataclass, and on loading this is verified to still be a data class with the same fields. No type checking is done on the values of the fields (as that's a whole different kettle of fish).

Use these as the default and object_hook arguments to json.dump[s]() and json.load[s]() respectively:

>>> print(json.dumps(c, default=dataclass_object_dump, indent=4))
{
    "mylist": [
        {
            "x": 0,
            "y": 0,
            "__dataclass__": "__main__.Point"
        },
        {
            "x": 10,
            "y": 4,
            "__dataclass__": "__main__.Point"
        }
    ],
    "__dataclass__": "__main__.C"
}
>>> json.loads(json.dumps(c, default=dataclass_object_dump), object_hook=dataclass_object_load)
C(mylist=[Point(x=0, y=0), Point(x=10, y=4)])
>>> json.loads(json.dumps(c, default=dataclass_object_dump), object_hook=dataclass_object_load) == c
True

or create instances of the JSONEncoder and JSONDecoder classes with those same hooks.
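For example, a small sketch with encoder and decoder instances:

import json

# c is the nested C instance from the question
encoder = json.JSONEncoder(default=dataclass_object_dump, indent=4)
decoder = json.JSONDecoder(object_hook=dataclass_object_load)

payload = encoder.encode(c)
assert decoder.decode(payload) == c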

Instead of using fully qualified module and class names, you could also use a separate registry to map permissible type names; check against the registry on encoding, and again on decoding, to ensure you don't forget to register dataclasses as you develop.
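A minimal sketch of such a registry (the register decorator and the DATACLASS_REGISTRY name are made up here for illustration):

import dataclasses

# registry of dataclasses that are allowed to round-trip through JSON
DATACLASS_REGISTRY = {}

def register(cls):
    if not (isinstance(cls, type) and dataclasses.is_dataclass(cls)):
        raise TypeError("only dataclass types can be registered")
    DATACLASS_REGISTRY[cls.__qualname__] = cls
    return cls

# The dump hook would then store cls.__qualname__ under '__dataclass__',
# and the load hook would look the name up in DATACLASS_REGISTRY instead
# of importing a module.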




Answer 6:


undictify is a library which could be of help. Here is a minimal usage example:

import json
from dataclasses import dataclass
from typing import List, NamedTuple, Optional, Any

from undictify import type_checked_constructor


@type_checked_constructor(skip=True)
@dataclass
class Heart:
    weight_in_kg: float
    pulse_at_rest: int


@type_checked_constructor(skip=True)
@dataclass
class Human:
    id: int
    name: str
    nick: Optional[str]
    heart: Heart
    friend_ids: List[int]


tobias_dict = json.loads('''
    {
        "id": 1,
        "name": "Tobias",
        "heart": {
            "weight_in_kg": 0.31,
            "pulse_at_rest": 52
        },
        "friend_ids": [2, 3, 4, 5]
    }''')

tobias = Human(**tobias_dict)


Source: https://stackoverflow.com/questions/53376099/python-dataclass-from-a-nested-dict
