Spark standalone configuration with multiple executors

时光说笑 2020-12-15 01:11

I'm trying to set up a standalone Spark 2.0 server to process an analytics function in parallel. To do this I want to have a single worker with multiple executors.
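Roughly what I have in mind is something like the sketch below; the master URL, core counts, and memory sizes are just placeholders, the idea being that one worker with, say, 8 cores would host several smaller executors for the same application:

```scala
import org.apache.spark.sql.SparkSession

object ParallelAnalytics {
  def main(args: Array[String]): Unit = {
    // Placeholder master URL and sizing values: one 8-core worker should be
    // able to host up to four 2-core executors for this application.
    val spark = SparkSession.builder()
      .appName("parallel-analytics")
      .master("spark://master-host:7077")    // standalone master, not local[*]
      .config("spark.executor.cores", "2")   // cores per executor
      .config("spark.executor.memory", "2g") // memory per executor
      .config("spark.cores.max", "8")        // total cores this app may use
      .getOrCreate()

    // ... run the analytics function in parallel here ...

    spark.stop()
  }
}
```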

2 Answers
  •  醉梦人生
    2020-12-15 02:04

    I believe you mixed up local and standalone modes:

    • Local mode is a development tool where all processes run inside a single JVM. An application is started in local mode by setting the master to local, local[*] or local[n]. Executor properties such as spark.executor.cores are not applicable in local mode because there is only one embedded executor.
    • Standalone mode requires a standalone Spark cluster: a master node (which can be started with the SPARK_HOME/sbin/start-master.sh script) and at least one worker node (which can be started with the SPARK_HOME/sbin/start-slave.sh script).

      SparkConf should be created with the master node's address (spark://host:port).
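    As a minimal sketch of the difference (the host and port are placeholders for your own cluster):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    object ConnectToStandalone {
      def main(args: Array[String]): Unit = {
        // Local mode (development only): everything runs in one JVM, so
        // executor settings have no effect:
        //   new SparkConf().setMaster("local[*]")

        // Standalone mode: point the application at the master started by
        // sbin/start-master.sh; "master-host:7077" is a placeholder.
        val conf = new SparkConf()
          .setAppName("analytics")
          .setMaster("spark://master-host:7077")

        val sc = new SparkContext(conf)
        // ... jobs submitted through sc now run on the standalone cluster ...
        sc.stop()
      }
    }
    ```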
