Is it possible to run a Spark standalone cluster locally on just one machine (which is different from just developing jobs locally, i.e., with local[*])?
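For context, the difference shows up in the master URL passed to spark-submit. A minimal sketch, assuming Spark is unpacked locally and my_job.py is a placeholder application:

# Local mode: driver and executors run inside a single JVM; no cluster processes needed
./bin/spark-submit --master "local[*]" my_job.py

# Standalone mode: connects to a separately running master process
./bin/spark-submit --master spark://localhost:7077 my_job.py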
The standard way to start a standalone master on a single machine is:

./sbin/start-master.sh

If you can't find the ./sbin/start-master.sh script on your machine, you can also start the master with:

./bin/spark-class org.apache.spark.deploy.master.Master
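To get a usable single-machine cluster, you also need at least one worker registered with that master. A sketch of the full flow, assuming default ports (master on 7077, its web UI on 8080); if the master does not bind to localhost, use the spark:// URL printed in its log or shown in the web UI:

# Start the master; its web UI defaults to http://localhost:8080
./sbin/start-master.sh

# Start one worker on the same machine, pointing at the master
# (on older Spark versions this script is named start-slave.sh)
./sbin/start-worker.sh spark://localhost:7077

# Equivalent without the sbin scripts:
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077

# Run a shell against the standalone cluster to verify it works
./bin/spark-shell --master spark://localhost:7077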