How to connect master and slaves in Apache-Spark? (Standalone Mode)

旧巷少年郎 — 2021-02-03 12:29

I'm using the Spark Standalone Mode tutorial page to install Spark in standalone mode.

1- I have started a master by:

./sbin/start-master.sh
3 Answers
  •  礼貌的吻别
    2021-02-03 13:13

    I usually start from the spark-env.sh template and set the properties I need. For a simple cluster you need:

    • SPARK_MASTER_IP

    Then, create a file called "slaves" in the same directory as spark-env.sh and list the slave IPs, one per line. Make sure you can reach every slave through ssh.
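    Concretely, the two files might look like the sketch below. The IP addresses are placeholders for your own machines, and the commands assume you run them from the Spark installation directory; note that newer Spark releases use `SPARK_MASTER_HOST` in place of `SPARK_MASTER_IP`:

    ```shell
    # Run from the Spark installation directory.
    # All IPs below are example placeholders.

    # Start from the shipped template
    cp conf/spark-env.sh.template conf/spark-env.sh

    # Address the workers use to reach the master
    # (newer Spark versions call this SPARK_MASTER_HOST)
    echo 'export SPARK_MASTER_IP=192.168.1.10' >> conf/spark-env.sh

    # conf/slaves: one worker host per line
    cat > conf/slaves <<'EOF'
    192.168.1.11
    192.168.1.12
    EOF
    ```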

    Finally, copy this configuration to every machine in your cluster. Then start the entire cluster by executing the start-all.sh script, and try spark-shell to check your configuration.

    > sbin/start-all.sh
    > bin/spark-shell
    
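    To verify that the cluster actually came up, you can point spark-shell at the master's URL explicitly and inspect it from inside the shell. The host below is a placeholder; 7077 is the default standalone master port:

    ```shell
    # Connect an interactive shell to the standalone master
    # (spark://<master-host>:7077 is the standalone default)
    ./bin/spark-shell --master spark://192.168.1.10:7077

    # Inside the shell, sc.master should echo the same URL:
    #   scala> sc.master

    # The master's web UI (http://<master-host>:8080 by default)
    # also lists every registered worker.
    ```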
