Can SparkContext and StreamingContext co-exist in the same program?

半阙折子戏 2020-12-31 12:50

I am trying to set up Spark Streaming code that reads lines from a Kafka server but processes them using rules written in another local file. I am creating a StreamingContext

2 Answers
  • 2020-12-31 13:05

    One application can have only ONE SparkContext. A StreamingContext is created on top of a SparkContext, so you just need to create the StreamingContext (ssc) from the existing SparkContext:

    import org.apache.spark.SparkContext
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val sc = new SparkContext(conf)
    val ssc = new StreamingContext(sc, Seconds(15))
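
    To see the one-context-per-JVM rule in action, here is a minimal, self-contained sketch. The object name and master setting are illustrative, and the exact exception thrown by the second attempt varies by Spark version:

    import org.apache.spark.{SparkConf, SparkContext}

    object SingleContextDemo {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("single-context-demo").setMaster("local[2]")
        val sc = new SparkContext(conf)

        // Spark allows only one active SparkContext per JVM, so a
        // second construction attempt fails rather than succeeding.
        val second = scala.util.Try(new SparkContext(conf))
        println(s"second context created: ${second.isSuccess}") // prints: false

        sc.stop()
      }
    }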
    

    If you use the following constructor instead,

    StreamingContext(conf: SparkConf, batchDuration: Duration)
    

    it internally creates another SparkContext:

    this(StreamingContext.createNewSparkContext(conf), null, batchDuration)
    

    The SparkContext can be obtained from the StreamingContext via

    ssc.sparkContext
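
    Putting those two points together, here is a minimal sketch that lets the conf-based constructor create the SparkContext internally and then reuses that context for ordinary RDD work. The app name, master setting, and the rules.txt path (taken from the question's description) are assumed placeholders:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("streaming-demo").setMaster("local[2]")

    // This constructor creates a SparkContext internally.
    val ssc = new StreamingContext(conf, Seconds(15))

    // The internally created SparkContext can be reused for batch work,
    // e.g. reading the local rules file mentioned in the question.
    val sc = ssc.sparkContext
    val rules = sc.textFile("rules.txt")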
    
  • 2020-12-31 13:19

    Yes, you can do it: first start a SparkSession, then use its underlying SparkContext to create the StreamingContext:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val spark = SparkSession.builder().appName("someappname")
      .config("spark.sql.warehouse.dir", warehouseLocation).getOrCreate()

    val ssc = new StreamingContext(spark.sparkContext, Seconds(1))
    

    Simple!!!
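
    Tying this back to the question's Kafka-plus-rules setup, a minimal end-to-end sketch might look like the following, using the spark-streaming-kafka-0-10 integration. The broker address, topic name, group id, and rules.txt path are all assumed placeholders:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    val spark = SparkSession.builder().appName("kafka-rules-demo").getOrCreate()
    val ssc = new StreamingContext(spark.sparkContext, Seconds(15))

    // Read the local rules file once on the driver and broadcast it
    // to the executors through the shared SparkContext.
    val rules = spark.sparkContext.broadcast(
      scala.io.Source.fromFile("rules.txt").getLines().toSet)

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092", // assumed broker address
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "rules-demo")

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("lines"), kafkaParams))

    // Keep only lines matching at least one rule, then print each batch.
    stream.map(_.value)
      .filter(line => rules.value.exists(rule => line.contains(rule)))
      .print()

    ssc.start()
    ssc.awaitTermination()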
