Is it possible to get the current spark context settings in PySpark?

Asked by 孤街浪徒 on 2021-01-29 22:05

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back out of SparkConf, but is there any way to access the complete config (including all defaults) using PySpark?
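
Roughly, what I can do today looks like this (a minimal sketch, assuming an existing SparkSession named spark):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    sc = spark.sparkContext

    # works for a key I set explicitly; falls back to the given default when it is unset
    print(sc.getConf().get('spark.worker.dir', '<not set>'))

    # getAll() lists the settings that are currently set, as (key, value) tuples
    for key, value in sc.getConf().getAll():
        print(key, value)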

13 Answers
  Answer by 猫巷女王i, 2021-01-29 22:37

    Updating the configuration in Spark 2.3.1

    To change the default Spark configuration, you can follow these steps:

    Import the required classes

    from pyspark.conf import SparkConf
    from pyspark.sql import SparkSession
    

    Get the current configuration from the existing SparkSession

    spark.sparkContext._conf.getAll()
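
    Note that _conf is an internal attribute; the public spark.sparkContext.getConf().getAll() returns the same list of (key, value) tuples, which can be turned into a dict for quick lookups, for example:

    current = dict(spark.sparkContext.getConf().getAll())
    print(current.get('spark.executor.memory'))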
    

    Update the default configurations

    conf = spark.sparkContext._conf.setAll([
        ('spark.executor.memory', '4g'),
        ('spark.app.name', 'Spark Updated Conf'),
        ('spark.executor.cores', '4'),
        ('spark.cores.max', '4'),
        ('spark.driver.memory', '4g'),
    ])
    

    Stop the current Spark session (the call below stops its underlying SparkContext)

    spark.sparkContext.stop()
    

    Create a Spark Session

    spark = SparkSession.builder.config(conf=conf).getOrCreate()
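
    Afterwards, the settings can be read back from the recreated session to confirm they were applied, for example:

    # a quick check against the new session; '4g' is expected here
    print(spark.sparkContext.getConf().get('spark.executor.memory'))
    print(spark.sparkContext.getConf().get('spark.cores.max'))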
    
