How to deal with executor memory and driver memory in Spark?

旧时难觅i — 2021-01-30 02:17

I am confused about dealing with executor memory and driver memory in Spark.

My environment settings are as below:

  • Memory 128 G, 16 CPU for 9 VM
  • C
2 Answers
  •  醉梦人生 — 2021-01-30 02:47

    In a Spark application, the Driver is responsible for task scheduling, while the Executors run the concrete tasks of your job.

    If you are familiar with MapReduce: both your map-style and reduce-style tasks are executed inside executors (in Spark they are called ShuffleMapTasks and ResultTasks). Likewise, whatever RDD you choose to cache lives in the executors' JVM heap and, if it doesn't fit, spills to the executors' local disk.
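
    A minimal sketch of that caching behavior (the input path is hypothetical; `MEMORY_AND_DISK` keeps partitions in the executor heap and spills the remainder to executor-local disk):

    ```scala
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.storage.StorageLevel

    val spark = SparkSession.builder().appName("cache-sketch").getOrCreate()

    // Hypothetical input path. The cached data lives on the executors,
    // not the driver: heap first, executor-local disk for the overflow.
    val lines = spark.sparkContext.textFile("hdfs:///tmp/input.txt")
    val upper = lines.map(_.toUpperCase).persist(StorageLevel.MEMORY_AND_DISK)

    upper.count()  // the first action materializes the cache on the executors
    ```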

    So a few GB should be enough for your driver; give the bulk of the memory to the executors, since that is where tasks run and cached data lives.
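
    One way to express that split in code; the values below are illustrative, not tuned for your 128 GB / 16-CPU VMs. Note that `spark.driver.memory` must be set before the driver JVM starts, so in client mode it is usually passed on the spark-submit command line (`--driver-memory`) rather than in the application:

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("memory-config-sketch")
      .config("spark.executor.memory", "16g")  // heap for task execution and cached RDDs
      .config("spark.executor.cores", "4")     // tasks running concurrently per executor
      .config("spark.driver.memory", "4g")     // "a few GB" for scheduling; see note above
      .getOrCreate()
    ```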
