Pyspark S3 error: java.lang.NoClassDefFoundError: com/amazonaws/services/s3/model/MultiObjectDeleteException

Backend · Unresolved · 3 replies · 1689 views

Happy的楠姐 2021-01-24 06:29

I have been unsuccessful in setting up a Spark cluster that can read AWS S3 files. The software I used is as follows:

  1. hadoop-aws-3.2.0.jar
  2. aws-java-sdk-1.11.887.jar
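For context, this error typically surfaces when the job is submitted with those two jars on the classpath. A sketch of such a submission (jar paths and script name are placeholders):

```shell
# Submitting with the separate hadoop-aws and slim aws-java-sdk jars
# (paths and script name are placeholders). This combination can raise
# NoClassDefFoundError: com/amazonaws/services/s3/model/MultiObjectDeleteException,
# because hadoop-aws 3.2.0 expects the classes packaged in the
# aws-java-sdk-bundle artifact, which the slim SDK jar does not provide.
spark-submit \
  --jars /path/to/hadoop-aws-3.2.0.jar,/path/to/aws-java-sdk-1.11.887.jar \
  my_s3_job.py
```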
3 replies
  •  走了就别回头了
    2021-01-24 07:24

    I was able to solve this issue on Spark 3.0/ Hadoop 3.2. I documented my answer here as well - AWS EKS Spark 3.0, Hadoop 3.2 Error - NoClassDefFoundError: com/amazonaws/services/s3/model/MultiObjectDeleteException

    Use the following AWS Java SDK bundle and this issue will be solved:

    aws-java-sdk-bundle-1.11.874.jar (https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk-bundle/1.11.874)
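    One way to apply this fix without downloading jars by hand is to let `spark-submit` resolve them from Maven Central via `--packages` (coordinates below follow the versions named in this answer; the script name is a placeholder):

    ```shell
    # Replace the slim SDK jar with the aws-java-sdk-bundle, resolved
    # automatically from Maven Central. Keeping hadoop-aws at the same
    # version as the cluster's Hadoop (3.2.0 here) avoids further
    # classpath mismatches.
    spark-submit \
      --packages org.apache.hadoop:hadoop-aws:3.2.0,com.amazonaws:aws-java-sdk-bundle:1.11.874 \
      my_s3_job.py
    ```

    The same coordinates can also be set in `spark-defaults.conf` as `spark.jars.packages` if every job on the cluster needs S3 access.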
