
Customize SparkContext using sparkConf.set(..) when using spark-shell

纵饮孤独, submitted 2019-11-26 20:25:02
In Spark, there are three primary ways to specify the options for the SparkConf used to create the SparkContext:

1. As properties in conf/spark-defaults.conf, e.g. the line: spark.driver.memory 4g
2. As arguments to spark-shell or spark-submit, e.g.: spark-shell --driver-memory 4g ...
3. In your source code, by configuring a SparkConf instance before using it to create the SparkContext, e.g.: sparkConf.set("spark.driver.memory", "4g")

However, when using spark-shell, the SparkContext is already created for you by the time you get a shell prompt, in the variable named sc. When using spark-shell, how do you use the third approach to set configuration options?
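One common workaround is to stop the pre-built context and create a new one from your own SparkConf inside the shell. A minimal sketch (note the caveat that JVM-level settings such as spark.driver.memory only take effect when the driver JVM starts, so those in particular still have to be passed on the spark-shell command line; application-level settings like spark.serializer can be applied this way):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Stop the SparkContext that spark-shell created for us.
sc.stop()

// Build our own configuration. Settings shown here are illustrative;
// spark.serializer is an application-level option that a fresh context will honor.
val conf = new SparkConf()
  .setAppName("custom-shell-context")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

// Create a replacement context and use it for the rest of the session.
val sc2 = new SparkContext(conf)
```

Alternatively, any individual property can be passed at launch with --conf, e.g. spark-shell --conf spark.serializer=org.apache.spark.serializer.KryoSerializer, which avoids restarting the context at all.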
