Intellij IDEA java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object

Submitted by 和自甴很熟 on 2019-12-03 05:04:40

I had this issue. Agree with Andrzej: IntelliJ IDEA uses its own compiler, so you have to disable it somehow. Go to Settings -> Scala -> Worksheet and uncheck "Run worksheet in the compiler process".

None of the answers was useful in my case. Still, I found a solution that worked for me: it was a problem with the scalatest version. In pom.xml, upgrading to

<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>2.2.4</version>
    <scope>test</scope>
</dependency>

helped.

So, although the above didn't solve my problem, it is related to IntelliJ.

Basically, IntelliJ was preferring the Scala SDK to resolve Class::method instead of loading it from the dependencies.

I used

-verbose:class

as a JVM switch to have it show me where it was looking; this immediately clued me in that it was trying to load the class from the Scala SDK instead of pulling in the libs from Maven as I expected.
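The same information -verbose:class prints can also be queried from code. A minimal sketch (the class name is the one from the error in this question):

```java
// Sketch: ask the JVM where a class was actually loaded from.
import java.security.CodeSource;

public class ClassOrigin {
    static String originOf(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap/JDK classes have no CodeSource; everything else
            // reports the jar or directory it came from.
            return (src == null) ? "bootstrap/JDK" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // The List cons-cell class from the NoSuchMethodError:
        System.out.println(originOf("scala.collection.immutable.$colon$colon"));
    }
}
```

If the printed location is the Scala SDK rather than the scala-library jar from your Maven repository, you have found the conflicting copy.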

I literally just deleted the Scala SDK from my project settings and the problem went away. So far, my experience with Scala (especially in a mixed Java environment) leads me to believe it has a ways to go to mature. This is such a fundamental class/method that I can't believe it vanished between versions. The Scala version I had installed was 2.11; apparently what gets pulled in from Maven is 2.10.4.

Anytime you see "NoSuchMethodError" it always means there is a version conflict; it's a question of why.
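To answer the "why", the first step is finding which dependency drags in the mismatched scala-library. A minimal sketch, assuming a default Maven setup (~/.m2 as the local repository):

```shell
# List every scala-library version sitting in the local Maven repo cache.
ls ~/.m2/repository/org/scala-lang/scala-library/ 2>/dev/null || echo "no cached scala-library"

# The authoritative view is Maven's own resolution (run inside the project dir):
command -v mvn >/dev/null && mvn -q dependency:tree -Dincludes=org.scala-lang:scala-library || true
```

In a multi-module build, running `dependency:tree` per module shows exactly which transitive dependency pulls in the 2.10.x library.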

Like others said here, I was having the same problem because I had some libraries using Scala 2.10 despite having scalatest at 2.11.

<!-- http://www.scalactic.org/ -->
<dependency>
    <groupId>org.scalactic</groupId>
    <artifactId>scalactic_2.11</artifactId>
    <version>${scalactic.version}</version>
    <scope>test</scope>
</dependency>

<!-- http://www.scalatest.org/ -->
<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>${scalactic.version}</version>
    <scope>test</scope>
</dependency>

Check that all the libraries you are using are on the same Scala version, e.g. change

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>

To

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>

with these properties:

<properties>
    <scala.tools.version>2.11.8</scala.tools.version>
    <scala.version>2.11.8</scala.version>
    <scalactic.version>3.0.0</scalactic.version>

    <!-- Library Versions -->
    <spark.version>2.0.0</spark.version>
    ....
</properties>

I just encountered the same problem. Turned out that I had downloaded the wrong version of Akka which included scala-library-2.10.x, while my project uses 2.11.6. Grabbing the latest version of Akka, which includes 2.11.5, solved the problem.

So, it seems this is a compatibility issue, and I'll be checking dependency versions first in the future.
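One way to guard against a dependency (like Akka here) smuggling in an older scala-library is to pin it once in dependencyManagement, so every transitive copy resolves to the same version. A sketch, using the 2.11.6 version from this thread:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.6</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```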

Error: java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object

Reason: This error is specifically due to a version mismatch between Spark and Scala. I hit it while using Spark 2.2.0 with Scala 2.10.6. I then tried different Scala versions, with no success.

Resolution: The error went away only when I changed the Scala version to 2.11.6, which is a perfect match for Spark 2.2.0. You might get away with a higher Scala version for the same issue, but I tried 2.12.x and it didn't work.

Suggestion: Set the versions below before doing any coding: Spark 2.2.0, Scala 2.11.6.

Also, I used the pom below:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
</dependencies>

I had the same problem. When I changed the code to use the map function, it worked! I don't know why, but that's how I fixed it.

I have found that this can be caused by having differing versions of scalatest and scalamock. The following Maven dependencies fixed it:

<dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId><!-- this was previously 2.10 -->
    <version>2.2.4</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.scalamock</groupId>
    <artifactId>scalamock-scalatest-support_2.11</artifactId>
    <version>3.2</version>
    <scope>test</scope>
</dependency>

I had the same thing when adding json4s. I solved it by changing the artifactId from json4s-native_2.12 to json4s-native_2.11. I guess this is related to the Scala version you are using; mine was 2.11 and not 2.12 (you can see yours in the properties node of the pom.xml file, mine is: <scala.version>2.11</scala.version>).
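One way to avoid this kind of suffix drift across dependencies is to keep the Scala binary version in a single property and reuse it in every artifactId. A sketch; the property name and the json4s version shown are illustrative:

```xml
<properties>
    <scala.binary.version>2.11</scala.binary.version>
</properties>

<!-- elsewhere in the pom -->
<dependency>
    <groupId>org.json4s</groupId>
    <artifactId>json4s-native_${scala.binary.version}</artifactId>
    <version>3.5.3</version>
</dependency>
```

Then switching the whole build between Scala 2.11 and 2.12 is a one-line change.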

I solved this by setting the Scala SDK version in my project from 2.12 to 2.11.
