Is it possible to get the current spark context settings in PySpark?

Asked by 孤街浪徒 on 2021-01-29 22:05 · 13 answers · 1125 views

I'm trying to get the path to spark.worker.dir for the current SparkContext.

If I explicitly set it as a config param, I can read it back out of SparkConf, but is there any way to access the complete config, defaults included, from PySpark?
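For context, one common approach (a sketch, not necessarily the answer the thread settles on): `SparkContext.getConf().getAll()` returns the explicitly-set settings as a list of `(key, value)` tuples, which can be turned into a dict for lookup. The pyspark calls below assume a local Spark installation and are shown commented out; the helper itself is plain Python.

```python
def conf_pairs_to_dict(pairs):
    """Convert SparkConf.getAll()'s list of (key, value) tuples into a dict."""
    return dict(pairs)

# With a live SparkContext (requires pyspark installed):
# from pyspark import SparkContext
# sc = SparkContext.getOrCreate()
# settings = conf_pairs_to_dict(sc.getConf().getAll())
# print(settings.get("spark.master"))
```

Note that `getAll()` only reports settings that were explicitly set; whether unset defaults (such as spark.worker.dir) are visible this way is exactly what the question is asking.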

13 answers

  •  说谎
     说谎 (OP)
     2021-01-29 22:23

    Not sure if you can get all the default settings easily, but specifically for the worker dir, it's quite straightforward:

    from pyspark import SparkFiles
    # Root directory that holds files added through SparkContext.addFile()
    print(SparkFiles.getRootDirectory())
    
