java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror

Submitted by 不羁岁月 on 2019-12-12 04:45:16

Question


java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
    at org.elasticsearch.spark.serialization.ReflectionUtils$.org$elasticsearch$spark$serialization$ReflectionUtils$$checkCaseClass(ReflectionUtils.scala:42)
    at org.elasticsearch.spark.serialization.ReflectionUtils$$anonfun$checkCaseClassCache$1.apply(ReflectionUtils.scala:84)

It seems like a Scala version incompatibility, but according to the Spark documentation, Spark 2.1.0 and Scala 2.11.8 should be compatible.

Here is my pom.xml. This is just a test of writing from Spark to Elasticsearch with es-hadoop, and I have no idea how to solve this exception.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cn.jhTian</groupId>
    <artifactId>sparkLink</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>${project.artifactId}</name>
    <description>My wonderfull scala app</description>
    <inceptionYear>2015</inceptionYear>
    <licenses>
        <license>
            <name>My License</name>
            <url>http://....</url>
            <distribution>repo</distribution>
        </license>
    </licenses>

    <properties>
        <encoding>UTF-8</encoding>
        <scala.version>2.11.8</scala.version>
        <scala.compat.version>2.11</scala.compat.version>

    </properties>

    <repositories>
        <repository>
            <id>ainemo</id>
            <name>xylink</name>
            <url>http://10.170.209.180:8081/nexus/content/groups/public/</url>
        </repository>
    </repositories>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.4</version><!-- 2.64 -->
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <!--<dependency>-->
            <!--<groupId>org.scala-lang</groupId>-->
            <!--<artifactId>scala-compiler</artifactId>-->
            <!--<version>${scala.version}</version>-->
        <!--</dependency>-->
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-reflect</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>2.6.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>3.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch-hadoop</artifactId>
            <version>5.3.0</version>
        </dependency>

        <!-- Test -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.10</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.specs2</groupId>
            <artifactId>specs2-core_${scala.compat.version}</artifactId>
            <version>2.4.16</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.scalatest</groupId>
            <artifactId>scalatest_${scala.compat.version}</artifactId>
            <version>2.2.4</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

This is my code:

import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

/**
  * Created by jhTian on 2017/4/19.
  */
object EsWrite {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf()
      .set("es.nodes", "1.1.1.1")
      .set("es.port", "9200")
      .set("es.index.auto.create", "true")
      .setAppName("es-spark-demo")
    val sc = new SparkContext(sparkConf)
    val job1 = Job("C开发工程师","http://job.c.com","c公司","10000")
    val job2 = Job("C++开发工程师","http://job.c++.com","c++公司","10000")
    val job3 = Job("C#开发工程师","http://job.c#.com","c#公司","10000")
    val job4 = Job("Java开发工程师","http://job.java.com","java公司","10000")
    val job5 = Job("Scala开发工程师","http://job.scala.com","java公司","10000")
//    val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
//    val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
//    val rdd=sc.makeRDD(Seq(numbers,airports))
    val rdd=sc.makeRDD(Seq(job1,job2,job3,job4,job5))
    rdd.saveToEs("job/info")
    sc.stop()
  }

}
case class Job(jobName:String, jobUrl:String, companyName:String, salary:String)

Answer 1:


Generally, NoSuchMethodError implies that the caller was compiled against a different version of a class than the one found on the classpath at runtime (or that you have multiple versions on the classpath).

In your case, I'd guess that es-hadoop is built against a different version of Scala. I've not used Maven in a little while, but I think the command you need to get some useful info is mvn dependency:tree. Use the output to see which version of Scala es-hadoop is built with, and then configure your project to use the same Scala version.
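If you also want to confirm which Scala standard library actually ends up on the runtime classpath, a minimal sketch (not part of the original answer) is to print the version from the driver before doing anything else:

import scala.util.Properties

// Minimal runtime check of the Scala library version visible on the classpath.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints e.g. "version 2.11.8"; if this differs from scala.version in the pom,
    // a second Scala version has been pulled in transitively.
    println(Properties.versionString)
  }
}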

To get stable/reproducible builds I'd recommend using something like the maven-enforcer-plugin:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>1.4.1</version>
    <executions>
        <execution>
            <id>enforce</id>
            <configuration>
                <rules>
                    <dependencyConvergence />
                </rules>
            </configuration>
            <goals>
                <goal>enforce</goal>
            </goals>
        </execution>
    </executions>
</plugin>

It can be annoying initially, but once you have all your dependencies sorted, you shouldn't get issues like this anymore.
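Beyond the enforcer plugin, one optional complement (my suggestion, not something the answer prescribes) is to pin the Scala artifacts in a dependencyManagement block, so that a transitive dependency cannot drag in a second Scala version:

<!-- Hedged sketch: pins scala-library and scala-reflect so any transitive
     Scala dependency resolves to the version declared in <properties>. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-reflect</artifactId>
            <version>${scala.version}</version>
        </dependency>
    </dependencies>
</dependencyManagement>

With dependencyConvergence and the pinned versions in place, a mismatched transitive scala-library is either overridden or fails the build, instead of surfacing as a NoSuchMethodError at runtime.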




Answer 2:


Use a dependency like this:

<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-20_2.11</artifactId>
    <version>5.2.2</version>
</dependency>

for Spark 2.0 and Scala 2.11.
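If you would rather stay on the 5.3.x line used in the question's pom, the same Spark/Scala-specific connector should also be published at 5.3.0 (an assumption worth verifying against Maven Central) and would replace the generic elasticsearch-hadoop dependency:

<!-- Assumed variant: the Spark 2.x / Scala 2.11 connector on the 5.3.x line,
     used instead of the generic elasticsearch-hadoop artifact. -->
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-20_2.11</artifactId>
    <version>5.3.0</version>
</dependency>

The calling code stays the same: import org.elasticsearch.spark._ and rdd.saveToEs("job/info") work unchanged with this connector.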



Source: https://stackoverflow.com/questions/43512865/java-lang-nosuchmethoderror-scala-reflect-api-javauniverse-runtimemirror
