Performance issue with pyspark job

Asked by 时光说笑 on 2021-02-02 04:36

I am using PySpark / Spark SQL to perform very simple tasks. The data sizes are very small, the largest being 215 MB, and 90% of the data sources are under 15 MB. We do filtering…
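For reference, here is a minimal sketch of the kind of small-data filtering job described above. The file paths, column name, and the shuffle-partition setting are assumptions for illustration, not the asker's actual code; with inputs of only a few MB, Spark's default of 200 shuffle partitions is a common source of avoidable overhead.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("small-data-filter").getOrCreate()

# For inputs in the 15-215 MB range, the default of 200 shuffle partitions
# mostly adds task-scheduling overhead; a small value fits this data size better.
spark.conf.set("spark.sql.shuffle.partitions", "8")

# Hypothetical source: a small CSV file with a header row.
df = spark.read.csv("/data/source.csv", header=True, inferSchema=True)

# A simple filter of the sort the question mentions (column name is illustrative).
filtered = df.filter(F.col("status") == "active")

filtered.write.mode("overwrite").parquet("/data/filtered_output")
```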
