Pyspark job processing JSON in S3 - Out of memory error - Java heap space

说谎 asked on 2021-01-01 23:48

My PySpark job runs on an AWS EMR cluster of 28 m5.24xlarge nodes (384 GB memory and 96 vCores per node, roughly 10.2 TB of memory in total). While processing JSON files from S3, it fails with an out-of-memory error: Java heap space.
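The post does not include the actual job code, so here is a minimal sketch of what a PySpark job reading JSON from S3 on EMR might look like. The bucket/prefix paths, column name, and executor sizing below are assumptions, not the asker's configuration; the executor settings only illustrate the usual approach of running several moderately sized executors per node rather than one huge JVM heap.

```python
# Hypothetical sketch -- paths, column names, and memory settings are assumed,
# since the original question does not show the job code.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("s3-json-processing")
    # Assumed executor sizing: on m5.24xlarge (384 GB / 96 vCores) one would
    # typically run many smaller executors so each JVM heap stays manageable.
    .config("spark.executor.memory", "32g")
    .config("spark.executor.cores", "8")
    .config("spark.executor.memoryOverhead", "4g")
    .getOrCreate()
)

# EMR exposes S3 through EMRFS, so s3:// paths can be read directly.
df = spark.read.json("s3://my-bucket/input/*.json")

# Placeholder transformation; the real job's logic is not shown in the post.
result = df.groupBy("some_column").count()

result.write.mode("overwrite").parquet("s3://my-bucket/output/")

spark.stop()
```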
