Multiple SparkSessions in single JVM

Backend · Open · 4 answers · 1809 views
没有蜡笔的小新 2021-02-05 10:15

I have a question about creating multiple Spark sessions in one JVM. I have read that creating multiple contexts is not recommended in earlier versions of Spark. Is that still true with newer versions?

4 Answers
  •  情话喂你
    2021-02-05 10:43

    If you already have a SparkSession and want to create a new one, call the newSession method on the existing SparkSession.

    import org.apache.spark.sql.SparkSession

    // `spark` is the existing SparkSession
    val newSparkSession: SparkSession = spark.newSession()
    

    The newSession method creates a new Spark session with isolated SQL configurations and temporary views. The new session shares the underlying SparkContext and cached data with the original.
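    To make the isolation concrete, here is a minimal sketch (assuming a local Spark environment; the session names and the `nums` view are invented for the example): a temporary view registered in one session is not visible in a session created via newSession, yet both sessions share one SparkContext.

    ```scala
    import org.apache.spark.sql.SparkSession

    object NewSessionDemo {
      def main(args: Array[String]): Unit = {
        // Build an initial session in local mode for the demo
        val spark = SparkSession.builder()
          .appName("newSession-demo")
          .master("local[*]")
          .getOrCreate()

        import spark.implicits._
        Seq(1, 2, 3).toDF("n").createOrReplaceTempView("nums")

        // New session: isolated SQL conf and temp views
        val session2 = spark.newSession()

        // The temp view exists only in the session that created it
        println(spark.catalog.tableExists("nums"))    // true
        println(session2.catalog.tableExists("nums")) // false

        // Both sessions share the same underlying SparkContext
        println(spark.sparkContext eq session2.sparkContext) // true

        spark.stop()
      }
    }
    ```

    Because the SparkContext is shared, cached data and cluster resources are common to both sessions; only session-scoped state (SQL configuration, temp views, registered UDFs) is isolated.
    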
