I am getting an error when I try to run a Spark application with Cassandra:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM
You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate StreamingContext and SparkContext in the same application. What you can do instead is build the StreamingContext on top of your existing SparkContext, so you have access to both if you really need that.
Use this constructor:

JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
Take a look at the second code snippet in the Spark Streaming documentation.
This is how your code should look:
import org.apache.spark.streaming.api.java.*;
JavaSparkContext existingSparkContext = ... //existing JavaSparkContext
JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(existingSparkContext, Durations.seconds(1000));
One way could be as follows:
SparkConf sparkConf = new SparkConf().setAppName("Example Spark App").setMaster("local[*]");
JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
// Duration is in milliseconds; Durations.seconds(...) makes the batch interval explicit.
JavaStreamingContext streamingContext = new JavaStreamingContext(sparkContext, Durations.seconds(1));
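Putting the pieces together, a minimal end-to-end sketch might look like the following. The app name, master URL, Cassandra host, and batch interval are placeholders, and the `spark.cassandra.connection.host` property assumes the DataStax spark-cassandra-connector is on the classpath:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingWithCassandra {
    public static void main(String[] args) throws Exception {
        // Exactly one SparkConf / JavaSparkContext per JVM.
        SparkConf conf = new SparkConf()
                .setAppName("Example Spark App")
                .setMaster("local[*]")
                // Assumes the spark-cassandra-connector is on the classpath;
                // 127.0.0.1 is a placeholder for your Cassandra contact point.
                .set("spark.cassandra.connection.host", "127.0.0.1");

        JavaSparkContext sparkContext = new JavaSparkContext(conf);

        // Build the streaming context on top of the existing SparkContext
        // instead of creating a second, independent context.
        JavaStreamingContext streamingContext =
                new JavaStreamingContext(sparkContext, Durations.seconds(10));

        // ... define input DStreams and transformations here ...

        streamingContext.start();
        streamingContext.awaitTermination();
    }
}
```

With this layout, batch and streaming code can share the single `sparkContext`, which avoids the "Only one SparkContext" exception entirely.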