Spark configuration: what is the difference between SPARK_DRIVER_MEMORY, SPARK_EXECUTOR_MEMORY, and SPARK_WORKER_MEMORY?

醉酒成梦 2021-02-04 20:32

I did my homework and read the documentation at https://spark.apache.org/docs/latest/configuration.html.

In spark-folder/conf/spark-env.sh, all three of these settings can be set.
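
For reference, the entries look something like the following (the values are only placeholders):

    # spark-folder/conf/spark-env.sh (placeholder values)
    SPARK_WORKER_MEMORY=4g
    SPARK_EXECUTOR_MEMORY=2g
    SPARK_DRIVER_MEMORY=1g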

1 Answer
余生分开走 2021-02-04 21:10

    First, you should know that one Worker (you can think of it as one machine, or one Worker Node) can launch multiple Executors (or multiple Worker Instances, the term the docs use).
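
    For example, the standalone launch scripts read the following spark-env.sh options; one machine can run more than one Worker instance, and each Worker instance can in turn launch executors (the values below are hypothetical):

        # spark-env.sh, standalone mode (hypothetical values)
        SPARK_WORKER_INSTANCES=2   # run two Worker instances on this one machine
        SPARK_WORKER_CORES=4       # cores each Worker instance may hand out
        SPARK_WORKER_MEMORY=8g     # memory each Worker instance may hand out to executors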

    • SPARK_WORKER_MEMORY is only used in standalone deploy mode
    • SPARK_EXECUTOR_MEMORY is used in YARN deploy mode

    In Standalone mode, you set SPARK_WORKER_MEMORY to the total amount of memory that can be used on one machine (by all Executors on that machine) to run your Spark applications.
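
    A standalone-mode sketch with made-up sizes, to show how the Worker budget and the per-executor size interact:

        # spark-env.sh on each Worker machine (standalone mode, hypothetical sizes)
        SPARK_WORKER_MEMORY=16g   # budget shared by all executors this Worker launches

        # Each application then picks its per-executor size when it is submitted, e.g.:
        #   ./bin/spark-submit --master spark://<master-host>:7077 --executor-memory 4g ...
        # With a 16g Worker budget and 4g executors, at most 4 such executors fit on that machine.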

    In contrast, in YARN mode, you set SPARK_EXECUTOR_MEMORY to the memory of a single Executor.

    • SPARK_DRIVER_MEMORY is used in YARN deploy mode, and specifies the memory for the Driver that runs your application and communicates with the Cluster Manager.
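
    On YARN, both of these are normally passed per application at submit time; a minimal sketch (the class name, jar, and sizes are made up):

        # YARN mode: driver and executor memory are per-application settings
        ./bin/spark-submit \
          --master yarn \
          --deploy-mode cluster \
          --driver-memory 2g \
          --executor-memory 4g \
          --num-executors 10 \
          --class com.example.MyApp \
          my-app.jar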
