Kill Spark Job programmatically

清歌不尽 2021-01-29 06:20

I am running a pyspark application through a Jupyter notebook. I can kill a job using the Spark Web UI, but I want to kill it programmatically.

How can I kill it?

2 Answers
  •  深忆病人
    2021-01-29 06:37

    Suppose that you wrote this code:

    from pyspark import SparkContext
    
    sc = SparkContext("local", "Simple App")
    
    # This will stop your app
    sc.stop()
    

    As described in the docs: http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=stop#pyspark.SparkContext.stop
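
    Note that sc.stop() shuts down the whole application. If the goal is only to kill running jobs while keeping the SparkContext alive, the standard PySpark SparkContext also exposes cancelAllJobs(), setJobGroup() and cancelJobGroup(). The snippet below is just a sketch of that approach; the group id "notebook-job" and the example workload are made up for illustration, and in a notebook the cancel calls would typically be issued from another cell while the job is still running:

    from pyspark import SparkContext

    sc = SparkContext("local", "Simple App")

    # Tag work submitted after this call with a job group id
    # ("notebook-job" is an illustrative name, not required by Spark).
    sc.setJobGroup("notebook-job", "long running count")

    # ... trigger some job, e.g. sc.parallelize(range(10**7)).count()

    # Cancel only the jobs in that group; the context stays usable.
    sc.cancelJobGroup("notebook-job")

    # Or cancel every scheduled/running job at once.
    sc.cancelAllJobs()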
