Pyspark S3 error: java.lang.NoClassDefFoundError: com/amazonaws/services/s3/model/MultiObjectDeleteException

Happy的楠姐 2021-01-24 06:29

I have been unsuccessful in setting up a Spark cluster that can read AWS S3 files. The software I used is as follows:

  1. hadoop-aws-3.2.0.jar
  2. aws-java-sdk-1.11.887.jar
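
For reference, the failure typically surfaces as soon as the S3A filesystem class is loaded; a read along the following lines is enough to trigger it (a minimal sketch, with a placeholder bucket and key):

```python
from pyspark.sql import SparkSession

# Minimal sketch: the first s3a:// access forces S3AFileSystem to load,
# and with mismatched hadoop-aws / AWS SDK jars that class load fails
# with NoClassDefFoundError. Bucket and key names are placeholders.
spark = SparkSession.builder.appName("s3-read-test").getOrCreate()

df = spark.read.text("s3a://my-bucket/some/file.txt")
df.show(5)
```

A likely root cause is a version mismatch: per Hadoop 3.2.0's dependency list, hadoop-aws-3.2.0 was compiled against aws-java-sdk-bundle 1.11.375, so pairing it with the standalone aws-java-sdk-1.11.887.jar can leave S3 model classes such as MultiObjectDeleteException unresolved at runtime.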
3 Answers
  •  盖世英雄少女心 2021-01-24 07:32

    So I cleaned up everything and reinstalled the following jar versions, and it worked: hadoop-aws-2.7.4.jar and aws-java-sdk-1.7.4.2.jar, with Spark install spark-2.4.7-bin-hadoop2.7 and Python 3.6. The key is to use jar versions that match each other and the Hadoop build that Spark ships with.
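
    A sketch of that working combination, assuming both jars were dropped into $SPARK_HOME/jars and credentials are supplied through Spark config (the access keys and bucket name below are placeholders):

```python
from pyspark.sql import SparkSession

# Spark 2.4.7 (Hadoop 2.7 build) with hadoop-aws-2.7.4.jar and
# aws-java-sdk-1.7.4.2.jar on the classpath. Hadoop 2.7 builds commonly
# need fs.s3a.impl set explicitly; credentials and bucket are placeholders.
spark = (
    SparkSession.builder
    .appName("s3-read")
    .config("spark.hadoop.fs.s3a.impl",
            "org.apache.hadoop.fs.s3a.S3AFileSystem")
    .config("spark.hadoop.fs.s3a.access.key", "YOUR_ACCESS_KEY")
    .config("spark.hadoop.fs.s3a.secret.key", "YOUR_SECRET_KEY")
    .getOrCreate()
)

df = spark.read.text("s3a://my-bucket/some/file.txt")
df.show(5)
```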
