How to run a Spark Java program

一个人的身影 2021-01-29 21:01

I have written a Java program for Spark. How do I compile and run it from the Unix command line? Do I have to include any jars when compiling in order to run it?

4 Answers
  •  深忆病人
    2021-01-29 21:40

    This answer is for Spark 2.3. If you want to test your Spark application locally, i.e. without the prerequisite of a Hadoop cluster, and even without starting any of the standalone Spark services, you can do this:

    JavaSparkContext jsc = new JavaSparkContext(new SparkConf().setAppName("Simple App"));
    
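    Wrapped in a complete class, that might look like the sketch below. The input path and the word being counted are illustrative assumptions, not part of the original answer; note that `setMaster()` is deliberately omitted so the master can be chosen on the `spark-submit` command line:

    ```java
    // SimpleApp.java -- a minimal sketch of a complete application around the
    // snippet above; the input path and filter word are assumptions.
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class SimpleApp {
        public static void main(String[] args) {
            // No setMaster() here: the master is supplied on the
            // spark-submit command line (e.g. --master local).
            SparkConf conf = new SparkConf().setAppName("Simple App");
            JavaSparkContext jsc = new JavaSparkContext(conf);

            // Count lines containing "spark" in a text file
            // (the file name is an assumption).
            JavaRDD<String> lines = jsc.textFile("README.md");
            long count = lines.filter(line -> line.contains("spark")).count();
            System.out.println("Lines with 'spark': " + count);

            jsc.stop();
        }
    }
    ```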

    And then, to run your application locally:

    $SPARK_HOME/bin/spark-submit --class SimpleApp --master local target/scala-2.10/simple-project_2.10-1.0.jar
    
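    As for the jar question: you do not bundle Spark itself into your jar; you declare it as a compile-time dependency and let `spark-submit` supply Spark's classes at run time. A minimal `build.sbt` sketch matching the jar path in the command above might look like this (the project name and versions are assumptions read off that path; note that Spark 2.3 itself is built for Scala 2.11, so with Spark 2.3 the jar would land under `target/scala-2.11/` instead of `target/scala-2.10/`):

    ```
    // build.sbt -- a minimal sketch; names and versions are assumptions.
    name := "simple-project"
    version := "1.0"
    scalaVersion := "2.11.12"

    // "provided": spark-submit supplies Spark's classes at run time,
    // so they are not packaged into your application jar.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.0" % "provided"
    ```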

    For this to work, you just need to extract the Spark tar file into $SPARK_HOME and set $SPARK_HOME in the Spark user's .profile.
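    The .profile lines implied above might look like this sketch; the extraction directory is an assumption, so use wherever you actually unpacked the Spark tar file:

    ```shell
    # Sketch of ~/.profile additions; the directory name is an assumption.
    export SPARK_HOME="$HOME/spark-2.3.0-bin-hadoop2.7"
    # Put spark-submit and the other Spark scripts on the PATH.
    export PATH="$PATH:$SPARK_HOME/bin"
    ```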
