Spark - How to run a standalone cluster locally

Asked by 别那么骄傲, 2021-01-31 10:45

Is it possible to run the Spark standalone cluster locally on just one machine? This is basically different from developing jobs locally (i.e., running in local mode): here the cluster manager itself (master and worker processes) runs on a single machine.

4 Answers
  •  闹比i (OP)
     2021-01-31 11:16

    Yes, you can: launch one master and one worker node and you are good to go.

    Launch the master:

    ./sbin/start-master.sh
    

    Launch a worker and register it with the master (here limited to 1 core and 512 MB of memory):

    ./bin/spark-class org.apache.spark.deploy.worker.Worker  spark://localhost:7077 -c 1 -m 512M
    

    Run the SparkPi example against the cluster:

    ./bin/spark-submit  --class org.apache.spark.examples.SparkPi   --master spark://localhost:7077  lib/spark-examples-1.2.1-hadoop2.4.0.jar 
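    To check that the worker actually registered before submitting, you can look at the master's web UI. A small sketch, assuming the default ports (8080 for the master UI) and that nothing else is bound to them:

    ```shell
    # The standalone master serves a status page on port 8080 by default.
    # A registered worker shows up in the "Workers" table.
    curl -s http://localhost:8080 | grep -i "worker"

    # The master's cluster port (7077, used in the spark:// URL above)
    # should also be listening:
    nc -z localhost 7077 && echo "master is up"
    ```

    If the worker does not appear, check its log under `logs/` for connection errors.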
    

    See the Apache Spark Standalone Mode documentation for details.
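    Instead of passing `-c` and `-m` on the worker's command line each time, the same limits can be set once in `conf/spark-env.sh`, which the `sbin` start scripts source on startup. A minimal sketch (variable names per the standalone-mode docs; `SPARK_MASTER_HOST` was called `SPARK_MASTER_IP` on older 1.x releases):

    ```shell
    # conf/spark-env.sh -- sourced by sbin/start-master.sh and the worker scripts
    SPARK_MASTER_HOST=localhost   # hostname the master binds to
    SPARK_WORKER_CORES=1          # total cores this worker may give to executors
    SPARK_WORKER_MEMORY=512m      # total memory this worker may allocate
    ```

    With this in place, `./sbin/start-master.sh` plus the worker start script bring up the same single-machine cluster without repeating the flags.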
