Spark: WholeTextFileRDD() causes out-of-memory java-heap-space error

借酒劲吻你 2020-11-30 15:42

I use WholeTextFileRDD() to load two files, 900 MB in total. My PC has 8 GB of memory and my cluster has 1 TB in total, yet the job still fails with the Java-heap-space out-of-memory error shown in the log below.
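This failure mode is consistent with how Spark's `SparkContext.wholeTextFiles` works: each input file becomes a single `(filename, content)` record, so the entire content of a file must fit in one task's memory at once, regardless of how much total memory the cluster has. By contrast, `textFile` splits input into line records that can be partitioned and processed independently. The plain-Python sketch below is only an analogy of that difference, not Spark code; the file path and sizes are made up for illustration.

```python
import os
import tempfile

# Build a small sample file standing in for one of the 900 MB inputs.
path = os.path.join(tempfile.mkdtemp(), "data.txt")
with open(path, "w") as f:
    for i in range(1000):
        f.write(f"line {i}\n")

# wholeTextFiles-style: the whole file becomes ONE in-memory record,
# so a very large file needs that much heap in a single task at once.
with open(path) as f:
    whole = f.read()  # one big string

# textFile-style: the input is consumed as independent line records,
# so per-record memory stays bounded no matter how large the file is.
line_count = 0
with open(path) as f:
    for _line in f:
        line_count += 1

# Both views see the same data; only the memory granularity differs.
assert whole.count("\n") == line_count == 1000
```

In Spark terms, the usual remedies are to switch to `textFile` if per-line processing suffices, or to raise `spark.driver.memory` / `spark.executor.memory` so a single task can hold an entire file's contents.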
