I have a question about creating multiple Spark sessions in one JVM. I have read that creating multiple contexts was not recommended in earlier versions of Spark. Is this still true in newer versions?
If you already have a SparkSession and want to create a new one, call the newSession method on the existing SparkSession.
import org.apache.spark.sql.SparkSession

val newSparkSession = spark.newSession()
The newSession method creates a new Spark session with isolated SQL configurations and isolated temporary tables. The new session shares the underlying SparkContext and cached data with the original session.
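To make the isolation/sharing split concrete, here is a minimal self-contained sketch. The local master, app name, and the choice of spark.sql.shuffle.partitions as the demo config key are my assumptions for illustration; the newSession, conf, catalog, and createOrReplaceTempView calls are the standard SparkSession API.

```scala
import org.apache.spark.sql.SparkSession

object NewSessionDemo {
  def main(args: Array[String]): Unit = {
    // Build the original session; local[*] master is just for this demo.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("newSession-demo")
      .getOrCreate()

    val newSparkSession = spark.newSession()

    // Both sessions share the same SparkContext instance.
    assert(spark.sparkContext eq newSparkSession.sparkContext)

    // SQL configuration is isolated per session: setting a key in one
    // session does not affect the other.
    spark.conf.set("spark.sql.shuffle.partitions", "10")
    newSparkSession.conf.set("spark.sql.shuffle.partitions", "50")
    println(spark.conf.get("spark.sql.shuffle.partitions"))           // 10
    println(newSparkSession.conf.get("spark.sql.shuffle.partitions")) // 50

    // Temporary views are session-scoped as well: a view registered in
    // one session is not visible from the other.
    spark.range(5).createOrReplaceTempView("nums")
    assert(spark.catalog.tableExists("nums"))
    assert(!newSparkSession.catalog.tableExists("nums"))

    spark.stop()
  }
}
```

This is why newSession is preferred over creating a second SparkContext: you get per-session isolation of SQL state while still reusing the same cluster resources and cached data.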