org.apache.spark.SparkException: Job aborted due to stage failure: Task from application

Backend · Unresolved · 2 replies · 617 views
攒了一身酷 2021-02-13 03:43

I have a problem running a Spark application on a standalone cluster (I use Spark version 1.1.0). I successfully started the master server with the command:

bash sbin/start-master.sh

2 Replies
  •  暗喜 (OP)
     2021-02-13 04:37

    For the benefit of others running into this problem:

    I faced an identical issue caused by a mismatch between the Spark connector and the Spark version in use. Spark was 1.3.1 while the connector was 1.3.0, and the same error message appeared:

    org.apache.spark.SparkException: Job aborted due to stage failure:
      Task 2 in stage 0.0 failed 4 times, most recent failure: Lost 
      task 2.3 in stage 0.0
    

    Updating the dependency in SBT solved the problem.
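
    For reference, aligning the two versions in `build.sbt` looks roughly like this. This is a sketch: the connector group and artifact names below are placeholders, not a real library — substitute whichever connector your project actually depends on:

    ```scala
    // build.sbt -- sketch: keep the connector version in step with the Spark version.
    scalaVersion := "2.10.4"

    // Single source of truth for the Spark version running on the cluster.
    val sparkVersion = "1.3.1"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
      // Before the fix this was pinned at 1.3.0, triggering the stage failure above.
      // "spark-some-connector" / "com.example" are illustrative names only.
      "com.example"      %% "spark-some-connector" % sparkVersion
    )
    ```

    Using a shared `sparkVersion` value for every Spark-related dependency makes this class of mismatch much harder to reintroduce.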
