PySpark job processing JSON in S3: out-of-memory error (Java heap space)

醉酒成梦 2020-12-29 10:44

PySpark job running on an AWS EMR cluster of 28 m5.24xlarge nodes (384 GB memory and 96 vCores per node; total cluster memory ~10.2 TB).

The Job nee
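Given the node specs above, a first step in debugging a Java-heap-space OOM is usually to check how executor memory was carved out of each node. A minimal sketch of that sizing arithmetic, assuming the common 5-cores-per-executor rule of thumb and a ~10% off-heap overhead fraction (both are assumptions, not settings taken from this question):

```python
# Hypothetical executor-sizing arithmetic for the cluster described above
# (m5.24xlarge: 96 vCores, 384 GB per node, 28 nodes).

VCORES_PER_NODE = 96
MEM_PER_NODE_GB = 384
NODES = 28

CORES_PER_EXECUTOR = 5    # assumed rule of thumb, not from the question
OVERHEAD_FRACTION = 0.10  # assumed spark.executor.memoryOverhead fraction

# Leave 1 vCore per node for YARN / OS daemons.
executors_per_node = (VCORES_PER_NODE - 1) // CORES_PER_EXECUTOR   # 19
total_executors = executors_per_node * NODES - 1                   # 531 (one slot kept for the driver)

mem_per_executor_gb = MEM_PER_NODE_GB // executors_per_node        # 20
heap_gb = int(mem_per_executor_gb * (1 - OVERHEAD_FRACTION))       # 18

print(f"--num-executors {total_executors} "
      f"--executor-cores {CORES_PER_EXECUTOR} "
      f"--executor-memory {heap_gb}g")
```

If the job instead runs with default settings (small executor heaps) while parsing large JSON objects, each executor can blow past its heap even though the cluster as a whole has terabytes of memory free.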
