Customize SparkContext using sparkConf.set(..) when using spark-shell
In Spark, there are 3 primary ways to specify the options for the `SparkConf` used to create the `SparkContext`:

1. As properties in `conf/spark-defaults.conf`, e.g. the line: `spark.driver.memory 4g`
2. As arguments to `spark-shell` or `spark-submit`, e.g. `spark-shell --driver-memory 4g ...`
3. In your source code, by configuring a `SparkConf` instance before using it to create the `SparkContext`, e.g. `sparkConf.set("spark.driver.memory", "4g")`

However, when using `spark-shell`, the `SparkContext` has already been created for you by the time you get a shell prompt, in the variable named `sc`. When using `spark-shell`, how do you customize the `SparkContext` with `sparkConf.set(..)` as in option 3?
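One common workaround (a sketch, not something stated in the question) is to stop the shell's auto-created context and build a replacement from your own `SparkConf` inside the REPL. The app name below is a hypothetical placeholder:

```scala
// Inside spark-shell: stop the SparkContext the shell created for us,
// then build a replacement from a SparkConf we configure ourselves.
import org.apache.spark.{SparkConf, SparkContext}

sc.stop()  // release the auto-created context bound to `sc`

val conf = new SparkConf()
  .setAppName("custom-shell")        // hypothetical app name
  .set("spark.executor.memory", "2g")

val sc = new SparkContext(conf)
```

Caveat: settings that are read when the driver JVM launches, notably `spark.driver.memory`, cannot take effect this way because the driver is already running; for those, options 1 or 2 above are the reliable route.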