Is it possible to get the current spark context settings in PySpark?

Asked by 孤街浪徒 on 2021-01-29 22:05 · 13 answers · 1121 views

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back, but is there a way to access the complete config (including all defaults) from PySpark?

13 Answers
  •  后悔当初
    2021-01-29 22:25

    You can use:

    sc.sparkContext.getConf().getAll()
    

    (This assumes `sc` is a `SparkSession`; if `sc` is already a `SparkContext`, call `sc.getConf().getAll()` directly.) For example, I often have the following at the top of my Spark programs:

    logger.info("\n".join(str(kv) for kv in sc.sparkContext.getConf().getAll()))
    
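    In PySpark, `SparkConf.getAll()` returns a list of `(key, value)` string tuples, which is convenient to turn into a dict for looking up a single setting. A minimal sketch of that lookup, using a hard-coded sample list in place of a live SparkContext (an assumption for illustration, so it runs without a Spark installation):

    ```python
    # With a live session, the pairs would come from Spark itself:
    #   from pyspark.sql import SparkSession
    #   spark = SparkSession.builder.getOrCreate()
    #   conf_pairs = spark.sparkContext.getConf().getAll()
    # Here a sample list stands in for getAll()'s return value.
    conf_pairs = [
        ("spark.app.name", "demo"),
        ("spark.worker.dir", "/tmp/spark-work"),  # hypothetical value
    ]

    # getAll() yields (key, value) tuples, so dict() gives easy lookup.
    conf = dict(conf_pairs)
    print(conf.get("spark.worker.dir"))  # -> /tmp/spark-work
    ```

    The dict form also makes it easy to check whether a setting was set at all, since `conf.get(...)` returns `None` for missing keys instead of raising.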
