I am using the isolated mode of Zeppelin's Spark interpreter; in this mode it starts a new job on the Spark cluster for each notebook. I want to kill the job via Zeppelin when the
It's a bit counterintuitive, but you need to go through the Interpreter menu rather than stopping the SparkContext directly:
1. Go to the interpreter list (Interpreter menu in the Zeppelin UI).
2. Find the Spark interpreter and click restart in its upper-right corner.
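If you'd rather do this without clicking through the UI (e.g. from a script), Zeppelin also exposes interpreter restart through its REST API. Below is a minimal sketch, assuming a Zeppelin server at http://localhost:8080 and an interpreter setting named "spark" -- adjust both to your deployment:

```python
# Sketch: restart the Spark interpreter setting via Zeppelin's REST API.
# Assumes Zeppelin is reachable at localhost:8080 and the setting is named "spark".
import requests

ZEPPELIN_URL = "http://localhost:8080"  # assumed server address -- change to yours

# List interpreter settings and pick out the Spark one by name
settings = requests.get(f"{ZEPPELIN_URL}/api/interpreter/setting").json()["body"]
spark_id = next(s["id"] for s in settings if s["name"] == "spark")

# Restart it -- in isolated (per-note) mode this kills the per-notebook Spark job
resp = requests.put(f"{ZEPPELIN_URL}/api/interpreter/setting/restart/{spark_id}")
resp.raise_for_status()
print("Spark interpreter restarted:", resp.status_code)
```

This is equivalent to pressing the restart button in the interpreter list, so it tears down the Spark application(s) the interpreter started for your notebooks.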