I'm trying to dynamically build a Row in PySpark 1.6.1, then build it into a DataFrame. The general idea is to extend the results of describe to include, for example, ...
In case the dict is not flat (i.e. it contains nested dicts or lists), you can convert the dict to a Row recursively.
from pyspark.sql import Row

def as_row(obj):
    # Convert a (possibly nested) dict into a Row; lists are converted element-wise.
    if isinstance(obj, dict):
        dictionary = {k: as_row(v) for k, v in obj.items()}
        return Row(**dictionary)
    elif isinstance(obj, list):
        return [as_row(v) for v in obj]
    else:
        return obj
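
For example, assuming an existing SQLContext named sqlContext and a made-up nested dict of statistics, a minimal sketch of how you might use it:

stats = {"summary": "mean", "values": {"col_a": 1.5, "col_b": 2.0}}  # hypothetical nested data
row = as_row(stats)                     # nested dict becomes a nested Row
df = sqlContext.createDataFrame([row])  # the inner Row maps to a struct column
df.printSchema()

Note that Row(**kwargs) sorts its fields alphabetically, so the column order in the resulting DataFrame may differ from the insertion order of the dict keys.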