Spark 2.3 is throwing the following exception. Can anyone please help? I tried adding the JARs.
308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster -
This issue arises from a mismatch between the Netty versions that Hadoop and Spark were compiled against. There are a couple of ways to deal with it.
A similar issue was solved by manually compiling Spark with the specific Netty version.
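If you take the recompile route, the general shape is something like the sketch below. This is an assumption, not the exact steps from the linked issue: you would first bump the Netty version entries in Spark's root pom.xml, and the exact property names vary by release.

    git clone https://github.com/apache/spark.git && cd spark
    git checkout v2.3.0
    # edit the Netty version entries in pom.xml here, then build:
    ./build/mvn -DskipTests clean package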
The other option, recommended by Suhas, is to copy the contents of the SPARK_HOME/jars folder into the various lib folders under HADOOP_HOME/share/hadoop (or only the one under yarn). That also solves the problem, but it is a dirty fix. So either use the latest versions of both, or compile them manually.
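For reference, that copy-based workaround might look like this sketch; the target lib folder is an assumption for a standard Hadoop layout, so adjust the paths to your distribution:

    # copy only the Netty jars that Spark ships with:
    cp "$SPARK_HOME"/jars/netty*.jar "$HADOOP_HOME"/share/hadoop/yarn/lib/
    # ...or, as described above, everything from SPARK_HOME/jars:
    # cp "$SPARK_HOME"/jars/*.jar "$HADOOP_HOME"/share/hadoop/yarn/lib/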
In my case, the aws-java-sdk required an older version of Netty. Deleting all the Netty jars and removing the aws-java-sdk from the project solved the problem.
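If you build with Maven, a quick way to confirm which dependency pulls in the old Netty (a generic diagnostic, not something from the original answer) is:

    mvn dependency:tree -Dincludes=io.netty,org.jboss.netty

Both group IDs are listed because older Netty 3.x releases were published under org.jboss.netty before moving to io.netty.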
I found the solution. The Hadoop binaries are compiled against an older Netty version, so you just need to replace those jars. I did not face any issues with Hadoop after replacing them.
You need to replace netty-3.6.2.Final.jar and netty-all-4.0.23.Final.jar under $HADOOP_HOME/share/hadoop with netty-3.9.9.Final.jar and netty-all-4.1.17.Final.jar, as in the sketch below.
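A minimal sketch of that swap; the exact lib subdirectories are an assumption (they are commonly common/lib, hdfs/lib, and yarn/lib):

    cd "$HADOOP_HOME"/share/hadoop
    # locate and remove the old jars wherever they appear:
    find . \( -name 'netty-3.6.2.Final.jar' -o -name 'netty-all-4.0.23.Final.jar' \) -delete
    # then copy the newer jars (e.g. downloaded from Maven Central)
    # into each of those lib folders, for example:
    # cp ~/Downloads/netty-3.9.9.Final.jar yarn/lib/
    # cp ~/Downloads/netty-all-4.1.17.Final.jar yarn/lib/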
This solved my problem. If you have an alternate solution, please do share it.