Why do Scala 2.11 and Spark with scallop lead to “java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror”?

死守一世寂寞 2021-01-11 10:46

I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar, with the Spark dependencies marked as provided (this is at: <

2 Answers
  • 2021-01-11 11:20

    The issue is that you've used incompatible Scala versions, i.e. Spark was compiled with Scala 2.10 and you were trying to use Scala 2.11.

    Move everything to Scala 2.10 and make sure you update your sbt build as well.

    Alternatively, you may compile the Spark sources against Scala 2.11.7 and use that build instead.
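    The version alignment described above can be expressed directly in the sbt build. A minimal sketch, assuming a Spark 1.x deployment; the version numbers here are illustrative placeholders, not values from the question:

    ```scala
    // build.sbt — sketch only; pin versions to what your cluster actually runs
    scalaVersion := "2.10.6"  // must match the Scala binary version Spark was built with

    libraryDependencies ++= Seq(
      // "provided" keeps Spark out of the fat jar; the cluster supplies it at runtime
      "org.apache.spark" %% "spark-core" % "1.6.3" % "provided",
      // %% appends the Scala binary suffix (_2.10), so scallop is resolved
      // for the same Scala version as the rest of the build
      "org.rossabaker" %% "scallop" % "0.9.5"
    )
    ```

    The key point is the `%%` operator: it makes sbt pick the artifact cross-built for your `scalaVersion`, which is exactly what goes wrong when a `_2.10` Spark meets `_2.11` application code.
    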

  • 2021-01-11 11:36

    I also encountered the same issue with spark-submit. In my case:

    The Spark job was compiled with: Scala 2.10.8

    The Scala version Spark on the cluster was compiled with: Scala 2.11.8

    To check the Spark and Scala versions on the cluster, run the "spark-shell" command; its startup banner prints both.
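    You can also check which Scala library your code actually runs against programmatically. A small sketch using only the standard library (the object name is just for illustration):

    ```scala
    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        // Reports the version of the scala-library jar on the runtime classpath,
        // which is the version that must match the one your job was compiled with
        println(scala.util.Properties.versionNumberString)
      }
    }
    ```

    Running this inside the deployed job (or pasting the `println` into spark-shell) shows the runtime Scala version; if it differs in the second digit (2.10 vs 2.11) from your build's `scalaVersion`, you will see `NoSuchMethodError`-style failures like the one above.
    
    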

    After recompiling the Spark job source with Scala 2.11.8 and resubmitting, the job worked.
