Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)

说谎 2021-01-07 01:58

I am getting an error when trying to run a Spark application with Cassandra.

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)
3 answers
  •  一整个雨季
    2021-01-07 02:50

    One way could be as follows:

        import org.apache.spark.SparkConf;
        import org.apache.spark.api.java.JavaSparkContext;
        import org.apache.spark.streaming.Duration;
        import org.apache.spark.streaming.api.java.JavaStreamingContext;

        SparkConf sparkConf = new SparkConf().setAppName("Example Spark App").setMaster("local[*]");
        JavaSparkContext jssc = new JavaSparkContext(sparkConf);
        // Build the streaming context from the existing JavaSparkContext rather
        // than from a new SparkConf, so no second SparkContext is started in
        // this JVM. Duration is in milliseconds; 1 ms is only for illustration.
        JavaStreamingContext jsc = new JavaStreamingContext(jssc, new Duration(1));
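
    The exception itself comes from Spark's internal guard that allows at most one active SparkContext per JVM (SPARK-2243). This is not Spark's actual code, but a minimal stand-alone sketch of the same "one instance per JVM" pattern, so you can see why constructing a second context fails until the first is stopped:

    ```java
    import java.util.concurrent.atomic.AtomicBoolean;

    // Hypothetical stand-in for SparkContext's one-per-JVM guard (SPARK-2243).
    // The class name and fields are illustrative, not Spark internals.
    class SingletonContext {
        private static final AtomicBoolean active = new AtomicBoolean(false);

        SingletonContext() {
            // Atomically claim the slot; fail if another context already holds it.
            if (!active.compareAndSet(false, true)) {
                throw new IllegalStateException(
                    "Only one SingletonContext may be running in this JVM");
            }
        }

        void stop() {
            // Release the slot so a new context may be created afterwards.
            active.set(false);
        }
    }

    public class Main {
        public static void main(String[] args) {
            SingletonContext first = new SingletonContext(); // succeeds
            try {
                new SingletonContext();                      // second one fails
            } catch (IllegalStateException e) {
                System.out.println("second context rejected");
            }
            first.stop();                                    // release the slot
            new SingletonContext();                          // allowed after stop()
            System.out.println("ok after stop");
        }
    }
    ```

    The same reasoning applies to the snippet above: reusing the already-running JavaSparkContext (or calling stop() on it first) avoids tripping the guard.
    
    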
    
