Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)

Submitted by 妖精的绣舞 on 2019-12-12 07:21:57

Question


I am getting an error when I try to run a Spark application with Cassandra:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). 

I am using Spark version 1.2.0, and it is clear that I am only using one SparkContext in my application. But whenever I try to add the following code for streaming purposes, I get this error:

JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(
            sparkConf, new Duration(1000));

Answer 1:


You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate streaming context and Spark context in the same code. What you can do is build the StreamingContext from your existing SparkContext, so you have access to both if you really need that.

Use this constructor: JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
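Putting the two answers together, a minimal sketch of the fix might look like the following. The app name and local master are placeholder assumptions for illustration; the key point is that the streaming context is built from the one existing SparkContext rather than from a SparkConf, which would try to create a second SparkContext and trigger SPARK-2243.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SingleContextExample {
    public static void main(String[] args) {
        // Placeholder app name and master; adjust to your own setup.
        SparkConf conf = new SparkConf()
                .setAppName("ActivitySummary")
                .setMaster("local[2]");

        // The one and only SparkContext in this JVM.
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Build the streaming context ON TOP OF the existing SparkContext
        // (instead of passing sparkConf, which would create a second one).
        // Duration is in milliseconds, matching the question's Duration(1000).
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(1000));

        // ... define your streams, then start/await as usual ...

        // Stopping the streaming context with stopSparkContext = true
        // also stops the underlying SparkContext.
        ssc.stop(true);
    }
}
```

Both contexts remain usable: the underlying SparkContext is reachable via ssc.sparkContext() if batch-style RDD operations are needed alongside the streams.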




Answer 2:


Take a look at the second code snippet linked in the original answer.

This is how your code should look:

import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.*;

JavaSparkContext existingSparkContext = ...   // existing JavaSparkContext
// 1000 ms batch interval, matching the Duration(1000) from the question
JavaStreamingContext activitySummaryScheduler =
        new JavaStreamingContext(existingSparkContext, new Duration(1000));


Source: https://stackoverflow.com/questions/29699562/exception-in-thread-main-org-apache-spark-sparkexception-only-one-sparkcontex
