Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)

Asked by 说谎, 2021-01-07 01:58

I am getting an error when I try to run a Spark application with Cassandra.

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
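
For context, this exception is thrown when the code ends up constructing two SparkContexts in the same JVM, typically by creating a JavaSparkContext and then an independent JavaStreamingContext. A minimal sketch of the anti-pattern (class and app names are illustrative, not from the original post):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class TwoContexts {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("CassandraApp").setMaster("local[*]");

        // First SparkContext created in this JVM
        JavaSparkContext sc = new JavaSparkContext(conf);

        // This constructor creates a SECOND SparkContext internally and fails with
        // SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
        JavaStreamingContext ssc = new JavaStreamingContext(conf, new Duration(1000));
    }
}
```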


        
3 Answers
  • 2021-01-07 02:44

    You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have separate Streaming and Spark contexts in the same code. What you can do is build a StreamingContext from your existing SparkContext, so you have access to both if you really need that.

    Use this constructor: JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
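
    Sketched in Java (names are illustrative), using that constructor so the streaming and batch APIs share one underlying SparkContext:

    ```java
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    SparkConf conf = new SparkConf().setAppName("CassandraApp").setMaster("local[*]");
    JavaSparkContext sc = new JavaSparkContext(conf);

    // Wrap the existing SparkContext instead of creating a new one
    JavaStreamingContext ssc = new JavaStreamingContext(sc, Durations.seconds(1));
    ```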

  • 2021-01-07 02:45

    Take a look at the second code snippet here.

    This is how your code should look:

    import org.apache.spark.streaming.api.java.*;
    
    JavaSparkContext existingSparkContext = ...   //existing JavaSparkContext
    JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(existingSparkContext, Durations.seconds(1000));
    
  • 2021-01-07 02:50

    One way could be as follows:

        SparkConf sparkConf = new SparkConf().setAppName("Example Spark App").setMaster("local[*]");
        JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
        // Duration is in milliseconds; 1000 ms = a 1-second batch interval
        JavaStreamingContext streamingContext = new JavaStreamingContext(sparkContext, new Duration(1000));
    