How to read JSON with duplicated columns via Spark?

Backend · Unresolved · 0 replies · 1099 views
野性不改  2021-02-05 02:11

I am reading JSON with Spark; there is nothing special, just:

spark.read.option('compression', 'gzip').option('dropFieldIfAllNull', True) \
    .json(source_final)
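
For reference, a minimal self-contained sketch of the same read (the SparkSession setup, the source_final path, and the commented-out caseSensitive workaround are assumptions for illustration, not part of the original snippet):

from pyspark.sql import SparkSession

# Local session for the sketch; in the original post `spark` already exists.
spark = SparkSession.builder.appName("read-json-example").getOrCreate()

# Hypothetical input location containing gzipped JSON files.
source_final = "/path/to/gzipped-json/"

df = (
    spark.read
    .option("compression", "gzip")        # as in the original read
    .option("dropFieldIfAllNull", True)   # skip fields that are null in every record during schema inference
    .json(source_final)
)

# If the duplicated field names differ only by letter case, enabling
# case-sensitive resolution is one possible workaround (an assumption
# about the data, not something stated in the question):
# spark.conf.set("spark.sql.caseSensitive", "true")

df.printSchema()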