I'm using standalone cluster mode, Spark 1.5.2. Even though I'm setting SPARK_WORKER_MEMORY in spark-env.sh, it looks like this setting is ignored.
I've encountered the same problem. The reason is that, in standalone mode, spark.executor.memory is actually ignored. What takes effect is spark.driver.memory, because the executor lives inside the driver. So what you can do is set spark.driver.memory as high as you want.
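As an illustrative sketch (the master URL, the 4g value, and the application jar name are placeholders, not from your setup), you could raise spark.driver.memory at submit time:

```shell
# Sketch: set driver memory when submitting the application.
# Per the explanation above, in this mode the executor lives inside
# the driver, so spark.driver.memory is the setting that counts.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.driver.memory=4g \
  my_app.jar
```

Note that spark.driver.memory must be set before the driver JVM starts, so it belongs on the spark-submit command line or in spark-defaults.conf; setting it programmatically on a SparkConf after the JVM is already running has no effect.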
This is where I've found the explanation: How to set Apache Spark Executor memory