Spark tasks block randomly on standalone cluster

广开言路 2021-02-06 17:29

We have a rather complex application that runs on Spark Standalone. In some cases, the tasks from one of the workers block randomly for an infinite amount of time in the RUNNING state.

1 Answer
  • 2021-02-06 18:01

    The issue was fixed for me by allocating just one core per executor. If I have executors with more than one core, the issue appears again. I haven't yet understood why this happens, but anyone facing a similar issue can try this; a configuration sketch follows below.
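
    For reference, a minimal sketch of how that workaround might be applied, assuming a Scala application. The app name and the spark.cores.max value are placeholder assumptions: spark.executor.cores caps the cores each executor gets, and spark.cores.max bounds the total cores the application claims on the standalone cluster, so together they yield several single-core executors instead of a few multi-core ones.

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSession

        object SingleCoreExecutors {
          def main(args: Array[String]): Unit = {
            // Workaround from the answer: give every executor exactly one core.
            val conf = new SparkConf()
              .setAppName("my-app")             // hypothetical application name
              .set("spark.executor.cores", "1") // one core per executor
              .set("spark.cores.max", "8")      // assumed cap: 8 cores -> up to 8 executors

            val spark = SparkSession.builder().config(conf).getOrCreate()
            try {
              // Trivial job just to confirm the session runs under this config.
              val count = spark.range(0L, 1000000L).count()
              println(s"count = $count")
            } finally {
              spark.stop()
            }
          }
        }

    The same effect can be achieved without code changes by passing --executor-cores 1 (and, on standalone, --total-executor-cores to cap the total) to spark-submit.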
