How can I set spark.task.maxFailures on AWS Databricks?

暗喜 2021-02-06 06:27

I would like to set spark.task.maxFailures to a value greater than 4 (the default). Using the Databricks 6.4 runtime, how can I set this value?

When I execute spark.conf.get("spark.task.maxFailures"), …
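
For context, here is a minimal PySpark sketch of what that read looks like, assuming a Databricks notebook where a SparkSession is available. spark.task.maxFailures is a scheduler setting read at application startup, so the usual place to supply it is the cluster's Spark config (Cluster > Advanced Options > Spark) rather than spark.conf.set at runtime; the value 10 below is only an illustration, not a confirmed fix:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # spark.task.maxFailures is read by the scheduler when the application
    # starts, so it is typically supplied in the cluster's Spark config
    # (Cluster > Advanced Options > Spark) as a key-value line, e.g.:
    #
    #     spark.task.maxFailures 10
    #
    # Once the cluster is running with that setting, read the value back:
    print(spark.conf.get("spark.task.maxFailures"))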
