Is there a better way to display an entire Spark SQL DataFrame?

轻奢々 2021-01-30 20:53

I would like to display an entire Spark SQL DataFrame using the Scala API. I can use the show() method:

myDataFrame.show(Int.MaxValue)
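
For context, a minimal sketch of this workaround, assuming a small hypothetical DataFrame named myDataFrame; note that show() also takes a truncate flag that keeps long cell values from being clipped:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("ShowAllRows").getOrCreate()
    import spark.implicits._

    // Hypothetical example data.
    val myDataFrame = Seq(("a", 1), ("b", 2), ("c", 3)).toDF("key", "value")

    // Ask show() for up to Int.MaxValue rows; truncate = false keeps
    // long cell values from being cut off at 20 characters.
    myDataFrame.show(Int.MaxValue, truncate = false)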
7 Answers
  •  遥遥无期
    2021-01-30 21:48

    As others have suggested, printing out an entire DataFrame is a bad idea. However, you can use df.rdd.foreachPartition(f) to print the data partition by partition without flooding the driver JVM the way collect would.
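
    A minimal sketch of that approach, assuming df is the DataFrame from the question; each partition is printed by the executor that holds it, so the output appears in the executors' stdout/logs rather than on the driver console:

        // Print rows partition by partition on the executors instead of
        // collecting everything to the driver.
        df.rdd.foreachPartition { partition =>
          partition.foreach(row => println(row))
        }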
