I am using the isolated mode of Zeppelin's Spark interpreter; in this mode it starts a new job for each notebook on the Spark cluster. I want to kill the job via Zeppelin when the
While working with Zeppelin and Spark I also stumbled upon the same problem and made some investigations. After some time, my first conclusion was that:

- stopping the SparkContext works by running sc.stop() in a paragraph
- restarting the SparkContext only works via the UI (the interpreter's restart button)

However, since the UI allows restarting the Spark Interpreter via a button press, why not just reverse engineer the API call behind the restart button? It turns out that restarting the Spark Interpreter sends the following HTTP request:
PUT http://localhost:8080/api/interpreter/setting/restart/spark
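For completeness, the same call can be issued from any HTTP client, not just curl. Here is a minimal Python sketch that builds the request against that endpoint; the unauthenticated localhost:8080 setup and the helper name are assumptions, so adjust them for your deployment:

```python
# Sketch: building the interpreter-restart request from a script instead of curl.
# Assumes Zeppelin runs unauthenticated on localhost:8080; a Shiro-secured
# instance would additionally need login credentials / a session cookie.
import urllib.request

def build_restart_request(base_url: str, interpreter: str) -> urllib.request.Request:
    """Build the PUT request for Zeppelin's interpreter restart endpoint."""
    url = f"{base_url}/api/interpreter/setting/restart/{interpreter}"
    return urllib.request.Request(url, method="PUT")

req = build_restart_request("http://localhost:8080", "spark")
print(req.get_method(), req.full_url)
# actually sending it would be: urllib.request.urlopen(req)
```

The request is only constructed here, not sent; calling urllib.request.urlopen(req) against a running Zeppelin instance would trigger the same restart as the curl command below.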
Fortunately, Zeppelin has the ability to work with multiple interpreters, one of which is a shell interpreter. Therefore, I created two paragraphs:
The first paragraph was for stopping the SparkContext whenever needed:
%spark
// stop SparkContext
sc.stop()
The second paragraph was for restarting the SparkContext programmatically:
%sh
# restart SparkContext
curl -X PUT http://localhost:8080/api/interpreter/setting/restart/spark
After stopping and restarting the SparkContext with these two paragraphs, I ran another paragraph to check whether the restart had worked... and it had! So while this is no official solution and more of a workaround, it is still legitimate, since we do nothing more than "pressing" the restart button from within a paragraph!
Zeppelin version: 0.8.1