Can I transform a complex json object to multiple rows in a dataframe in Azure Databricks using pyspark?

Backend · unresolved · 1 answer · 1074 views

Asked by 后悔当初 on 2021-01-27 06:43

I have some JSON being read from a file, where each row looks something like this:

    {
        "id": "someGu

1 Answer
  • answered 2021-01-27 07:23

    You can use the from_json function to convert a column/field from a StructType into a MapType, explode it, and then extract the desired fields. For your example JSON, you will need to repeat this several times:

    from pyspark.sql.functions import explode, from_json, to_json, json_tuple, coalesce
    
    df.select(
        # 1. players: StructType -> MapType, then explode into one row per player
        explode(from_json(to_json('data.data.players'), "map<string,string>"))
      ).select(
        # 2. pull the per-player fields out of the JSON string
        json_tuple('value', 'locationId', 'id', 'name', 'assets', 'dict')
          .alias('Location', 'Player_ID', 'Player', 'assets', 'dict')
      ).select(
        # 3. assets/dict: one row per asset (use whichever field is present)
        '*',
        explode(from_json(coalesce('assets', 'dict'),
                          "map<string,struct<isActive:boolean,playlists:string>>"))
      ).selectExpr(
        'Location',
        'Player_ID',
        'Player',
        'key as Asset_ID',
        'value.isActive',
        # 4. playlists: one row per playlist entry
        'explode(from_json(value.playlists, "map<string,string>")) as (Playlist_ID, Playlist_Status)'
      ).show()
    +--------+---------+--------+--------+--------+------------+---------------+
    |Location|Player_ID|  Player|Asset_ID|isActive| Playlist_ID|Playlist_Status|
    +--------+---------+--------+--------+--------+------------+---------------+
    |someGuid| player_1|someName|assetId1|    true|     someId1|           true|
    |someGuid| player_1|someName|assetId1|    true|someOtherId1|          false|
    |someGuid| player_1|someName|assetId2|    true|     someId1|           true|
    |someGuid| player_2|someName|assetId3|    true|     someId1|           true|
    |someGuid| player_2|someName|assetId3|    true|someOtherId1|          false|
    |someGuid| player_2|someName|assetId4|    true|     someId1|           true|
    +--------+---------+--------+--------+--------+------------+---------------+
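
    The chain above flattens three nested map levels (players → assets → playlists), producing one output row per playlist entry. As a rough illustration of the same flattening in plain Python (the question's JSON is truncated, so the field names and shape below are assumptions inferred from the output table above):

    ```python
    import json

    # Hypothetical input mirroring the shape implied by the answer's output;
    # the original question's JSON is cut off, so this structure is assumed.
    raw = json.dumps({
        "players": {
            "player_1": {
                "locationId": "someGuid", "id": "player_1", "name": "someName",
                "assets": {
                    "assetId1": {
                        "isActive": True,
                        "playlists": {"someId1": "true", "someOtherId1": "false"},
                    },
                },
            },
        },
    })

    rows = []
    for player in json.loads(raw)["players"].values():           # explode players map
        for asset_id, asset in player["assets"].items():         # explode assets map
            for pl_id, pl_status in asset["playlists"].items():  # explode playlists map
                rows.append((player["locationId"], player["id"], player["name"],
                             asset_id, asset["isActive"], pl_id, pl_status))

    for r in rows:
        print(r)
    ```

    Each nested `for` loop plays the role of one `explode` in the PySpark chain: the row count multiplies at every level, which is why a single player with one asset and two playlists yields two rows.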
    