Terminating a Spark step in AWS

挽巷 · 2021-02-05 14:56

I want to set up a series of Spark steps on an EMR Spark cluster, and terminate the current step if it's taking too long. However, when I ssh into the master node and run hadoo

2 Answers
  • 2021-02-05 15:07

    That's easy:

    yarn application -kill [application id]
    

    you can list your running applications with

    yarn application -list
    
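If you want to automate this (for example, to kill any application that has been running too long), the application IDs can be parsed out of the `yarn application -list` output. A minimal sketch, assuming the default tab-separated listing format; the sample line below is illustrative, not captured from a real cluster:

```shell
# Sample of what `yarn application -list` prints; in practice you would
# capture it with:  LISTING="$(yarn application -list 2>/dev/null)"
LISTING='application_1612500000000_0001	myStep	SPARK	hadoop	default	RUNNING'

# Extract the application ID (first column, starting with "application_")
# of every line describing a RUNNING application.
APP_IDS="$(printf '%s\n' "$LISTING" | awk '/^application_/ && /RUNNING/ {print $1}')"

echo "$APP_IDS"

# Then kill each one (commented out so this sketch has no side effects):
# for id in $APP_IDS; do yarn application -kill "$id"; done
```

From there you could wrap the kill loop in a cron job or a timeout check against each application's start time.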
  • 2021-02-05 15:13

You can kill the application from the ResourceManager UI (linked at the top right under cluster status). In the ResourceManager, click on the application you want to kill; on the application page there is a small "kill" link (top left) you can click to kill the application.

Obviously you can also SSH in, but I think this way is faster and easier for some users.
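The ResourceManager UI's kill button maps onto the ResourceManager's Cluster Application State REST endpoint, so the same kill can be scripted over HTTP. A hedged sketch; the hostname and application ID below are placeholders you would replace with your own (port 8088 is the ResourceManager's default web port):

```shell
# Placeholders -- substitute your cluster's ResourceManager host and the
# target application ID (hypothetical values for illustration).
RM_HOST="ip-10-0-0-1.ec2.internal"
APP_ID="application_1612500000000_0001"
URL="http://${RM_HOST}:8088/ws/v1/cluster/apps/${APP_ID}/state"
echo "$URL"

# PUT the KILLED state to the application (commented out here because the
# host above is a placeholder):
# curl -X PUT -H 'Content-Type: application/json' \
#      -d '{"state": "KILLED"}' "$URL"
```

Note that this requires the ResourceManager port to be reachable from where you run the script, which on EMR may mean an SSH tunnel anyway.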
