Spark Standalone Number Executors/Cores Control

梦毁少年i 2020-11-27 21:37

So I have a Spark standalone server with 16 cores and 64GB of RAM. I have both the master and worker running on the server. I don't have dynamic allocation enabled. I am on

1 Answer
  • 2020-11-27 21:40

    Disclaimer: I really don't know if --num-executors should work or not in standalone mode. I haven't seen it used outside YARN.

    Note: As pointed out by Marco, --num-executors is no longer in use on YARN.

    You can effectively control the number of executors in standalone mode with static allocation (this works on Mesos as well) by combining spark.cores.max and spark.executor.cores, where the number of executors is determined as:

    floor(spark.cores.max / spark.executor.cores)
    

    For example:

    --conf "spark.cores.max=4" --conf "spark.executor.cores=2"
    
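    which yields floor(4 / 2) = 2 executors with 2 cores each. As a fuller sketch (not from the original answer; the master URL, class name, and jar are hypothetical placeholders), the same settings passed to spark-submit against a standalone master:

    # minimal sketch: cap the app at 4 cores total, 2 cores per executor,
    # so the standalone scheduler can launch at most 2 executors
    spark-submit \
      --master spark://master-host:7077 \
      --conf "spark.cores.max=4" \
      --conf "spark.executor.cores=2" \
      --class com.example.MyApp \
      my-app.jar

    The same two properties can also be set in spark-defaults.conf or on a SparkConf before the application starts, with the same effect.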