Is it possible to get the current spark context settings in PySpark?

孤街浪徒 · 2021-01-29 22:05

I'm trying to get the path to `spark.worker.dir` for the current SparkContext.

If I explicitly set it as a config param, I can read it back, but is there a way to see the complete set of settings for the current context?

13 Answers
  • 2021-01-29 22:42

    Simply running

    sc.getConf().getAll()
    

    should give you a list of `(key, value)` tuples with all settings.
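
    For context, `sc.getConf()` returns the `SparkConf` backing the context, and `getAll()` yields the settings as a list of `(key, value)` tuples, which converts naturally to a dict. A minimal sketch, assuming `pyspark` is installed locally; the app name and the example path are illustrative, not from the question:

    ```python
    # Sketch: inspecting the current Spark configuration via SparkConf.getAll().
    # Assumes a local pyspark installation; values below are illustrative.
    from pyspark import SparkConf, SparkContext

    conf = (
        SparkConf()
        .setMaster("local[1]")
        .setAppName("inspect-conf")                 # hypothetical app name
        .set("spark.worker.dir", "/tmp/spark-work")  # example explicit setting
    )
    sc = SparkContext(conf=conf)

    # getAll() returns [(key, value), ...]; dict() makes lookups convenient.
    settings = dict(sc.getConf().getAll())
    print(settings.get("spark.worker.dir"))

    sc.stop()
    ```

    Note that `getAll()` only reports settings that were explicitly set (by your code, `spark-defaults.conf`, or the submit command), not every internal default Spark falls back to.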
