spark-submit on standalone cluster complains about scala-2.10 jars not existing

Posted by ☆樱花仙子☆ on 2019-12-24 06:36:04

Question


I'm new to Spark and downloaded the pre-built Spark binaries from Apache (spark-2.1.0-bin-hadoop2.7).

When submitting my Scala (2.11.8) uber jar, the cluster throws an error:

java.lang.IllegalStateException: Library directory '/root/spark/assembly/target/scala-2.10/jars' does not exist; make sure Spark is built

I'm not running Scala 2.10, and as far as I know this Spark build isn't compiled with Scala 2.10.

Could it be that one of my dependencies is based on Scala 2.10?
Any suggestions as to what could be wrong?


Answer 1:


Not sure what is wrong with the pre-built spark-2.1.0, but I've just downloaded Spark 2.2.0 and it works great.




Answer 2:


Try setting SPARK_HOME="location of your Spark installation" on your system or in your IDE.
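The path in the error message (`/root/spark/assembly/target/scala-2.10/jars`) looks like a Spark *source* checkout built against Scala 2.10, which suggests a stale `SPARK_HOME` pointing at an old build rather than at the downloaded binary distribution. A minimal sketch of fixing the environment, assuming the pre-built package was unpacked to `/opt/spark-2.1.0-bin-hadoop2.7` (a hypothetical path; adjust to your own location):

```shell
# Point SPARK_HOME at the unpacked pre-built distribution
# (hypothetical path -- replace with wherever you extracted the tarball)
export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.7

# Make sure the matching spark-submit is found first on PATH
export PATH="$SPARK_HOME/bin:$PATH"
```

After this, `spark-submit --version` should report the Scala version the distribution was built with (2.11 for the pre-built 2.1.x packages), and the `scala-2.10` path should no longer appear.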



Source: https://stackoverflow.com/questions/45293666/spark-submit-on-standalone-cluster-complain-about-scala-2-10-jars-not-exist
