Question
I am getting an error when trying to run a Spark application with Cassandra:
Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243).
I am using Spark version 1.2.0, and it's clear that I am only using one SparkContext in my application. But whenever I try to add the following code for streaming purposes, I get this error:
JavaStreamingContext activitySummaryScheduler =
    new JavaStreamingContext(sparkConf, new Duration(1000));
Answer 1:
You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate streaming context and Spark context in the same application. What you can do is build a StreamingContext on top of your existing SparkContext, so you have access to both if you really need that.
Use this constructor:
JavaStreamingContext(JavaSparkContext sparkContext, Duration batchDuration)
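Putting the two pieces together, a minimal sketch of the fix could look like the following. This assumes the Spark 1.x Java API; the application name, master URL, and class name are placeholders, not from the original question:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextApp {
    public static void main(String[] args) {
        // Create exactly one SparkContext for the whole JVM.
        SparkConf sparkConf = new SparkConf()
                .setAppName("ActivitySummary") // placeholder app name
                .setMaster("local[2]");        // placeholder master; streaming needs >= 2 threads locally

        JavaSparkContext sc = new JavaSparkContext(sparkConf);

        // Build the streaming context on top of the existing SparkContext,
        // instead of passing the SparkConf (which would try to create a
        // second SparkContext and trigger the SPARK-2243 error).
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(1000));

        // ... define input streams and transformations here, then:
        ssc.start();
        ssc.awaitTermination();
    }
}
```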
Answer 2:
This is how your code should look:
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.*;

JavaSparkContext existingSparkContext = ... // existing JavaSparkContext
JavaStreamingContext activitySummaryScheduler =
    new JavaStreamingContext(existingSparkContext, Durations.seconds(1)); // 1 second, same as new Duration(1000)
Source: https://stackoverflow.com/questions/29699562/exception-in-thread-main-org-apache-spark-sparkexception-only-one-sparkcontex