I am running pyspark application through Jupyter notebook. I can kill a job using Spark Web UI, but I want to kill it programmatically.
How can I do that?
Suppose that you wrote this code:
    from pyspark import SparkContext

    sc = SparkContext("local", "Simple App")

    # This will stop your app
    sc.stop()
As described in the docs: http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=stop#pyspark.SparkContext.stop
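Note that `sc.stop()` shuts down the whole SparkContext, so you can't run anything else in the notebook afterwards without creating a new context. If you only want to kill a particular running job while keeping the session alive, PySpark's job-group API (`setJobGroup`, `cancelJobGroup`, `cancelAllJobs`) can do that. A minimal sketch; the group id `"my-job-group"` is just an arbitrary name chosen for this example:

    from pyspark import SparkContext

    sc = SparkContext("local", "Simple App")

    # Tag jobs submitted from this thread with a group id.
    # interruptOnCancel=True asks Spark to interrupt the task threads on cancel.
    sc.setJobGroup("my-job-group", "long-running job", interruptOnCancel=True)

    # ... trigger an action here, e.g. rdd.count(), typically from another cell ...

    # Cancel only the jobs in that group; the SparkContext stays usable.
    sc.cancelJobGroup("my-job-group")

    # Or cancel everything that is currently scheduled or running:
    sc.cancelAllJobs()

Because the notebook kernel blocks while an action runs, you would usually call `cancelJobGroup` from a separate thread or cell.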