Spark 2.3 java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric

刺人心 2021-01-12 12:52

Spark 2.3 is throwing the following exception. Can anyone please help? I tried adding the JARs

308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster -
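A quick way to see which Netty jar is actually being picked up is to ask the JVM where the class came from; a minimal Scala sketch, runnable in spark-shell:

    // Print the jar that provides io.netty.buffer.PooledByteBufAllocator.
    // If it points at an old netty-all-4.0.x jar (e.g. from Hadoop's lib
    // folders), that jar is shadowing Spark's newer Netty and triggers the
    // NoSuchMethodError on metric().
    val src = classOf[io.netty.buffer.PooledByteBufAllocator]
      .getProtectionDomain.getCodeSource
    println(if (src == null) "bootstrap classpath" else src.getLocation)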

3 Answers
  • 2021-01-12 13:20

    This issue arises from a mismatch between the Netty versions that Hadoop and Spark were compiled against. There are two ways around it.

    A similar issue was solved by manually compiling Spark against a specific Netty version.

    The other option, as recommended by Suhas, is to copy the contents of the SPARK_HOME/jars folder into the various lib folders under HADOOP_HOME/share/hadoop (or only the one under yarn). That also solves the problem, but it is a dirty fix, so prefer using the latest versions of both or compiling them yourself.
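    A minimal sketch of that copy fix, assuming the standard layout and that SPARK_HOME and HADOOP_HOME are set (Scala, run on the node):

        import java.io.File
        import java.nio.file.{Files, StandardCopyOption}

        // Copy every jar Spark ships in SPARK_HOME/jars into YARN's lib
        // directory. This overwrites same-named jars; older Netty jars with
        // different filenames may still need deleting by hand.
        val sparkJars = new File(sys.env("SPARK_HOME"), "jars")
        val yarnLib   = new File(sys.env("HADOOP_HOME"), "share/hadoop/yarn/lib")
        for (jar <- Option(sparkJars.listFiles).getOrElse(Array.empty[File])
             if jar.getName.endsWith(".jar"))
          Files.copy(jar.toPath, new File(yarnLib, jar.getName).toPath,
                     StandardCopyOption.REPLACE_EXISTING)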

  • 2021-01-12 13:20

    An older version of Netty was being pulled in by aws-java-sdk. Deleting all the Netty jars and removing aws-java-sdk from the project solved the problem.
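    If dropping aws-java-sdk entirely is not an option, excluding its transitive Netty dependency is a softer variant; a build.sbt sketch (the coordinates and version here are illustrative assumptions, not from the question):

        // build.sbt -- keep aws-java-sdk but exclude the old Netty it drags
        // in transitively, so Spark's own Netty version is used instead.
        // The artifact and version below are illustrative assumptions.
        libraryDependencies += ("com.amazonaws" % "aws-java-sdk" % "1.11.271")
          .excludeAll(ExclusionRule(organization = "io.netty"))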

  • 2021-01-12 13:28

    I found the solution. The Hadoop binaries are compiled against an older Netty version, and you just need to replace those jars. I did not face any issues with Hadoop after replacing them.

    You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under $HADOOP_HOME/share/hadoop with netty-3.9.9.Final.jar and netty-all-4.1.17.Final.jar.

    This solved my problem. If you have an alternative solution, please do share.
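    The stale jars can live in several lib folders under share/hadoop; a small Scala sketch (assuming HADOOP_HOME is set) that lists every copy to replace:

        import java.io.File

        // Recursively list every netty*.jar under HADOOP_HOME/share/hadoop
        // so each stale copy can be swapped for the newer Final versions.
        def nettyJars(dir: File): Seq[File] =
          Option(dir.listFiles).toSeq.flatten.flatMap { f =>
            if (f.isDirectory) nettyJars(f)
            else if (f.getName.startsWith("netty") && f.getName.endsWith(".jar")) Seq(f)
            else Seq.empty
          }

        nettyJars(new File(sys.env("HADOOP_HOME"), "share/hadoop")).foreach(println)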
