I have written a Java program for Spark, but how do I compile and run it from the Unix command line? Do I have to include any jars when compiling or running it?
This answer is for Spark 2.3. If you want to test your Spark application locally, i.e. without the prerequisite of a Hadoop cluster and without having to start any of the standalone Spark services, you can create the context without hard-coding a master:
JavaSparkContext jsc = new JavaSparkContext(new SparkConf().setAppName("Simple App"));
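For context, a minimal sketch of what such a SimpleApp class might look like (the input file and the line-count logic are just illustrative; substitute your own processing):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SimpleApp {
    public static void main(String[] args) {
        // No setMaster() here: the master is supplied on the command line by spark-submit
        JavaSparkContext jsc = new JavaSparkContext(new SparkConf().setAppName("Simple App"));

        // Illustrative workload: count the lines of a text file
        JavaRDD<String> lines = jsc.textFile("README.md");
        System.out.println("Line count: " + lines.count());

        jsc.stop();
    }
}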
And then, to run your application locally:
$SPARK_HOME/bin/spark-submit --class SimpleApp --master local target/scala-2.10/simple-project_2.10-1.0.jar
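The jar path above is just the one from the Spark quick-start example (an sbt Scala build); point spark-submit at whatever jar your own build produces, for example something like target/simple-project-1.0.jar from a Maven build. Spark's own jars are put on the classpath by spark-submit at run time, so they do not need to be bundled into your application jar.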
For this to work, you just need to extract the Spark tarball into a directory of your choice and set $SPARK_HOME to that directory in the Spark user's .profile.
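For example, assuming the tarball was extracted to /opt/spark-2.3.0 (adjust to wherever you unpacked it), the .profile entries would look something like:

export SPARK_HOME=/opt/spark-2.3.0
export PATH=$SPARK_HOME/bin:$PATH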