Pyspark Dataframe number of rows too large, how to avoid failure trying to count()?

Asked by 攒了一身酷 on 2021-01-07 04:33 (0 answers, 1748 views)

Trying to run some Spark jobs. When df.count() gets called, I get the following stack trace:

Starting the job...
Starting job: count at N
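
The trace is cut off above, so the exact failure isn't visible, but count() on a very large DataFrame forces a full scan of the data and is a common point of executor or memory failure. Below is a minimal sketch of a few ways to avoid the full scan; it assumes a SparkSession named spark and a hypothetical parquet input path, neither of which comes from the original post.

    from pyspark.sql import SparkSession

    # Hypothetical session and input path, for illustration only.
    spark = SparkSession.builder.appName("count-example").getOrCreate()
    df = spark.read.parquet("/data/events")

    # 1) If you only need to know whether the DataFrame has any rows at all,
    #    take a single row instead of counting the entire dataset.
    is_empty = len(df.take(1)) == 0

    # 2) If an approximate row count is acceptable, countApprox() on the
    #    underlying RDD returns a result within a timeout (in milliseconds)
    #    rather than waiting for every task to finish.
    approx_count = df.rdd.countApprox(timeout=60000, confidence=0.95)

    # 3) If the exact count is required, persisting first avoids recomputing
    #    the full lineage when count() triggers execution.
    df = df.cache()
    exact_count = df.count()

Which option fits depends on why the count is needed: an emptiness check or an approximate figure is far cheaper than an exact count, which always has to touch every partition.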