Spark - How to run a standalone cluster locally

别那么骄傲 2021-01-31 10:45

Is there the possibility to run the Spark standalone cluster locally on just one machine (which is basically different from just developing jobs locally, i.e., local[*])?

4 answers
  •  一整个雨季
    2021-01-31 11:17

    A small update: as of the latest version (2.1.0), the default is to bind the master to the hostname, so when starting a worker locally, use the output of hostname:

    # register a local worker with the master; -c caps it at 1 core, -m at 512 MB of memory
    ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://`hostname`:7077 -c 1 -m 512M
    
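    Note that this assumes a master is already running on port 7077. If it isn't, a minimal sketch for starting one is the launch script that ships with the Spark distribution:

    # start a standalone master; by default it binds to this machine's
    # hostname and listens on port 7077, with a web UI on port 8080
    ./sbin/start-master.sh
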

    And to run an example, simply run the following command:

    bin/run-example SparkPi
    
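    By default that runs SparkPi in local mode. Since run-example is a thin wrapper around spark-submit, the example can also be pointed at the standalone master (a sketch, assuming the master above is running on port 7077):

    # submit the example to the standalone master instead of running it locally
    ./bin/run-example --master spark://`hostname`:7077 SparkPi
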
