Zeppelin: How to restart SparkContext in Zeppelin

轻奢々  2021-02-07 18:27

I am using the isolated mode of Zeppelin's Spark interpreter; in this mode it starts a new job for each notebook on the Spark cluster. I want to kill the job via Zeppelin when the …

4 Answers
  •  [愿得一人]
    2021-02-07 18:59

    While working with Zeppelin and Spark I also stumbled upon the same problem and did some investigation. After some time, my first conclusions were:

    • Stopping the SparkContext can be accomplished by calling sc.stop() in a paragraph
    • Restarting the SparkContext only works via the UI (Menu -> Interpreter -> Spark Interpreter -> click the restart button)

    However, since the UI allows restarting the Spark Interpreter via a button press, why not just reverse-engineer the API call behind the restart button? It turned out that restarting the Spark Interpreter sends the following HTTP request:

    PUT http://localhost:8080/api/interpreter/setting/restart/spark
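
    In my setup the plain interpreter name spark worked in the URL. If it does not on your instance, the interpreter settings (including their IDs) can be listed through the same REST API; a minimal sketch, assuming Zeppelin runs on localhost:8080:

    %sh
    # list all interpreter settings and look up the "id" of the spark entry
    curl -s http://localhost:8080/api/interpreter/setting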
    

    Fortunately, Zeppelin has the ability to work with multiple interpreters, one of which is a shell interpreter. Therefore, I created two paragraphs:

    The first paragraph was for stopping the SparkContext whenever needed:

    %spark
    // stop SparkContext
    sc.stop()
    

    The second paragraph was for restarting the SparkContext programmatically:

    %sh
    # restart SparkContext
    curl -X PUT http://localhost:8080/api/interpreter/setting/restart/spark
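
    A side note for the isolated per-note mode from the question: as far as I know, newer Zeppelin versions also accept a note ID in the request body, so that only the interpreter process bound to that note is restarted. I have not verified this on 0.8.1, and the note ID below is just a placeholder:

    %sh
    # restart only the interpreter process of a single note (note ID is a placeholder)
    curl -X PUT -H "Content-Type: application/json" \
         -d '{"noteId": "<your-note-id>"}' \
         http://localhost:8080/api/interpreter/setting/restart/spark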
    

    After stopping and restarting the SparkContext with the two paragraphs, I ran another paragraph to check whether the restart worked...and it worked! So while this is no official solution and more of a workaround, it is still legitimate, as we do nothing other than "press" the restart button from within a paragraph!
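
    A check paragraph can be as simple as the following sketch (the exact statements are only illustrative); any small job that runs without errors confirms that a fresh SparkContext is available:

    %spark
    // verify that the restarted interpreter provides a working SparkContext
    println(sc.version)
    println(sc.parallelize(1 to 10).sum())  // tiny job to confirm the context is usable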

    Zeppelin version: 0.8.1
