Optimizing Spark resources to avoid running out of memory and disk space

Asked by 没有蜡笔的小新 on 2021-01-24 09:43

I have a dataset of around 190 GB that was partitioned into 1000 partitions.
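
For context, 190 GB spread across 1000 partitions works out to roughly 190 MB per partition, which is close to the commonly recommended 128-256 MB per-partition target. A minimal PySpark sketch of that layout (the S3 path and Parquet format below are placeholders, not from the question):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("partition-sizing").getOrCreate()

# Hypothetical input location and format -- substitute your own.
# 190 GB / 1000 partitions ~= 190 MB per partition, near the
# commonly recommended 128-256 MB per-partition sweet spot.
df = spark.read.parquet("s3://example-bucket/dataset/")
df = df.repartition(1000)
print(df.rdd.getNumPartitions())  # -> 1000
```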

My EMR cluster allows a maximum of 10 r5a.2xlarge TASK nodes and 2 CORE nodes.
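
For reference, each r5a.2xlarge provides 8 vCPUs and 64 GiB of RAM. A minimal executor-sizing sketch under those constraints follows; the exact YARN memory ceiling (`yarn.nodemanager.resource.memory-mb`) depends on the EMR release, and the assumption that executors also run on the 2 CORE nodes may not match your setup, so treat all of these numbers as assumptions rather than a definitive configuration:

```python
from pyspark.sql import SparkSession

# Assumed sizing for r5a.2xlarge (8 vCPUs, 64 GiB RAM), one executor per node:
# reserve ~1 vCPU for the OS/YARN daemons, and keep heap + overhead below the
# YARN container ceiling. spark.executor.memoryOverhead defaults to
# max(384 MB, 10% of executor memory) if not set explicitly.
spark = (
    SparkSession.builder
    .appName("resource-sizing-sketch")
    .config("spark.executor.cores", "7")             # 8 vCPUs - 1 reserved
    .config("spark.executor.memory", "42g")          # JVM heap per executor
    .config("spark.executor.memoryOverhead", "6g")   # 42g + 6g = 48g container
    .config("spark.executor.instances", "12")        # 10 TASK + 2 CORE nodes (assumed)
    .config("spark.sql.shuffle.partitions", "1000")  # match the 1000-partition layout
    .getOrCreate()
)
```

With these numbers, 12 executors × 7 cores gives 84 concurrent task slots, so the 1000 partitions would complete in roughly 12 waves.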
