SparkException using JavaStreamingContext.getOrCreate(): Only one SparkContext may be running in this JVM


Question


Related to this question, I got the tip that the getOrCreate idiom should be used to avoid this issue. But trying:

JavaStreamingContextFactory contextFactory = new JavaStreamingContextFactory() {

    @Override
    public JavaStreamingContext create() {
        final SparkConf conf = new SparkConf().setAppName(NAME);
        return new JavaStreamingContext(conf, Durations.seconds(BATCH_SPAN));
    }

};

final JavaStreamingContext context = JavaStreamingContext.getOrCreate("/tmp/"+NAME, contextFactory);

I'm still getting:

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
scala.Option.getOrElse(Option.scala:120)
org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1702)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1641)
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1570)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
    at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2239)
    at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2325)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:2197)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:874)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:140)
    at org.example.ExamplePipeline$1.create(ExamplePipeline.java:56)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:706)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$$anonfun$7.apply(JavaStreamingContext.scala:705)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
    at org.apache.spark.streaming.api.java.JavaStreamingContext$.getOrCreate(JavaStreamingContext.scala:705)
    at org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(JavaStreamingContext.scala)
    at org.example.ExamplePipeline.createExecutionContext(ExamplePipeline.java:70)
    at org.example.ExamplePipeline.exec(ExamplePipeline.java:116)
    at org.example.ExamplePipeline.main(ExamplePipeline.java:157)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:786)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:123)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

What am I doing wrong?

Thanks in advance.


Answer 1:


According to this question, I think this is how I should do it:

SparkConf conf = new SparkConf().setAppName(NAME);
JavaSparkContext ctx = JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
JavaStreamingContext context = new JavaStreamingContext(ctx, Durations.seconds(BATCH_SPAN));

Right?
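
For reference, if you still want the checkpoint-based recovery that JavaStreamingContext.getOrCreate provides, a minimal sketch (assuming Spark 1.x with the JavaStreamingContextFactory API, and placeholder constants NAME and BATCH_SPAN as in the question) might combine both ideas: reuse any already-active SparkContext inside the factory instead of letting the JavaStreamingContext constructor create a new one.

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.api.java.JavaStreamingContextFactory;

public class ExamplePipelineSketch {

    // Placeholder values, matching the names used in the question.
    private static final String NAME = "example";
    private static final long BATCH_SPAN = 10;

    public static JavaStreamingContext createContext() {
        JavaStreamingContextFactory factory = new JavaStreamingContextFactory() {
            @Override
            public JavaStreamingContext create() {
                SparkConf conf = new SparkConf().setAppName(NAME);
                // Reuse an existing SparkContext if one is already running in this JVM,
                // instead of letting the JavaStreamingContext constructor create a second one.
                JavaSparkContext jsc =
                        JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(conf));
                JavaStreamingContext jssc =
                        new JavaStreamingContext(jsc, Durations.seconds(BATCH_SPAN));
                jssc.checkpoint("/tmp/" + NAME);
                return jssc;
            }
        };
        // Recovers from the checkpoint if one exists; otherwise calls the factory above.
        return JavaStreamingContext.getOrCreate("/tmp/" + NAME, factory);
    }
}

This way the factory never tries to construct a second SparkContext, which is what triggers the SPARK-2243 error when something else (here, apparently the Spring init method) has already created one.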



Source: https://stackoverflow.com/questions/41722984/sparkexception-using-javastreamingcontext-getorcreate-only-one-sparkcontext-m
