Job cancelled because SparkContext was shut down

北荒 2021-01-15 01:26

While running my Spark program in a Jupyter notebook I got the error "Job cancelled because SparkContext was shut down". I am using Spark without Hadoop. The same prog…

1 Answer
  • 2021-01-15 02:14

    This problem is solved now. I had to create a checkpoint directory because the number of training iterations was more than 20. The code for creating the checkpoint directory is:

    sc.setCheckpointDir("path to directory")  # sc is the active SparkContext instance
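
    For context, here is a minimal PySpark sketch of the same idea. It assumes a local SparkSession, a placeholder checkpoint path /tmp/spark-checkpoints, and a small made-up ratings DataFrame (none of these are from the original question); it sets the checkpoint directory before an iterative ALS fit so the RDD lineage is periodically truncated instead of growing across many iterations:

    from pyspark.sql import SparkSession
    from pyspark.ml.recommendation import ALS

    # Assumed local setup; adjust the master URL and paths for your cluster.
    spark = SparkSession.builder.master("local[*]").appName("checkpoint-demo").getOrCreate()
    sc = spark.sparkContext

    # setCheckpointDir is called on the SparkContext instance;
    # "/tmp/spark-checkpoints" is a placeholder path.
    sc.setCheckpointDir("/tmp/spark-checkpoints")

    # Hypothetical training data: (user, item, rating) triples.
    ratings = spark.createDataFrame(
        [(0, 0, 4.0), (0, 1, 2.0), (1, 1, 3.0), (1, 2, 5.0)],
        ["user", "item", "rating"],
    )

    # With many iterations, a checkpoint directory keeps the lineage short;
    # checkpointInterval controls how often intermediate RDDs are checkpointed.
    als = ALS(maxIter=25, rank=2, checkpointInterval=10,
              userCol="user", itemCol="item", ratingCol="rating")
    model = als.fit(ratings)

    spark.stop()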
    