Is it possible to get the current spark context settings in PySpark?

孤街浪徒 · 2021-01-29 22:05

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back out of SparkConf, but is there any way to access the complete config (including all defaults) using PySpark?
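
For instance, a minimal sketch of setting a value explicitly and reading it back; the master, app name, and worker directory here are placeholder values, not from the original post:

    from pyspark import SparkConf, SparkContext

    # Explicitly set spark.worker.dir, then read it back from the context.
    conf = (SparkConf()
            .setMaster("local[*]")     # placeholder master
            .setAppName("conf-demo")   # placeholder app name
            .set("spark.worker.dir", "/tmp/spark-worker"))
    sc = SparkContext(conf=conf)

    print(sc.getConf().get("spark.worker.dir"))  # -> /tmp/spark-worker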

13 Answers
  •  旧巷少年郎 · 2021-01-29 22:27

    Suppose I want to increase the driver memory at runtime using SparkSession:

    from pyspark.sql import SparkSession

    # Note: getOrCreate() returns any already-running session, in which
    # case a new driver-memory value will not take effect.
    s2 = SparkSession.builder.config("spark.driver.memory", "29g").getOrCreate()
    

    Now I want to view the updated settings:

    # Read the setting back from the session's runtime config.
    s2.conf.get("spark.driver.memory")
    

    To get all the settings at once, you can use spark.sparkContext._conf.getAll(), as sketched below.
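
    A minimal sketch of dumping the full configuration, reusing the s2 session from above; note that _conf is a private attribute, so this may change between Spark versions:

    # getAll() returns a list of (key, value) pairs for every set property.
    for key, value in s2.sparkContext._conf.getAll():
        print(key, "=", value)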

    Hope this helps
