Spark 2.3 is throwing the following exception. Can anyone please help? I already tried adding the JARs.
308 [Driver] ERROR org.apache.spark.deploy.yarn.ApplicationMaster -
I found the solution. The Hadoop binaries ship with older Netty jars, which simply need to be replaced with newer ones. After replacing them I did not face any issues with Hadoop itself.
You need to replace netty-3.6.2.Final.jar with netty-3.9.9.Final.jar, and netty-all-4.0.23.Final.jar with netty-all-4.1.17.Final.jar, under $HADOOP_HOME/share/hadoop.
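For reference, here is a sketch of how the swap can be done from a shell. The netty jars can appear in more than one subdirectory of the Hadoop tree, so I use find to catch every copy; the common/lib destination directory and the idea that you have already downloaded the new jars (e.g. from Maven Central) into the current directory are assumptions, so adjust the paths for your installation.

```shell
# List every stale netty jar bundled under the Hadoop share tree,
# so you can see which subdirectories need the replacement.
find "$HADOOP_HOME/share/hadoop" -name 'netty*.jar'

# Delete the old versions wherever they appear.
find "$HADOOP_HOME/share/hadoop" -name 'netty-3.6.2.Final.jar' -delete
find "$HADOOP_HOME/share/hadoop" -name 'netty-all-4.0.23.Final.jar' -delete

# Copy the newer jars in; common/lib is an assumed destination --
# put them wherever the old ones were found above.
cp netty-3.9.9.Final.jar netty-all-4.1.17.Final.jar \
   "$HADOOP_HOME/share/hadoop/common/lib/"
```

After the swap, restart the affected services so the new jars are picked up on the classpath.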
This solved my problem. If you have an alternative solution, please do share it.