Spark ignores SPARK_WORKER_MEMORY?

梦如初夏 · 2021-01-16 16:29

I'm using Spark 1.5.2 in standalone cluster mode.

Even though I'm setting SPARK_WORKER_MEMORY in spark-env.sh, it looks like this setting is ignored.
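For reference, this is roughly what such a `spark-env.sh` entry looks like in a standalone deployment (the 4g value here is illustrative, not from the original post):

```shell
# conf/spark-env.sh on each worker node
# Total memory this worker may hand out to executors (illustrative value)
export SPARK_WORKER_MEMORY=4g
```

Note that `SPARK_WORKER_MEMORY` caps what a worker can offer to executors; the memory an individual executor actually requests is controlled separately by `spark.executor.memory`.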

4 Answers

一整个雨季 · 2021-01-16 17:07

    I've encountered the same problem. The reason is that, in this mode, spark.executor.memory is effectively ignored. What actually has an effect is spark.driver.memory, because the executor lives inside the driver process.

    So what you can do is to set spark.driver.memory as high as you want.

    This is where I've found the explanation: How to set Apache Spark Executor memory
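    Following that advice, the driver memory can be raised either on the command line or in `spark-defaults.conf`; both are standard Spark options, though the 4g value below is just an example:

    ```shell
    # Option 1: pass it to spark-submit directly
    spark-submit --driver-memory 4g --master spark://master:7077 my_app.py

    # Option 2: set it once in conf/spark-defaults.conf
    # spark.driver.memory    4g
    ```

    Keep in mind spark.driver.memory must be set before the JVM starts, so setting it inside the application via SparkConf has no effect for an already-launched driver.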
