Why “could not find implicit” error in Scala + Intellij + ScalaTest + Scalactic but not from sbt

清歌不尽 2021-01-07 23:55

I have this code that works 100% from sbt when executing sbt test, but it throws a compilation error in IntelliJ IDEA.

import org.         


        
6 Answers
  • 2021-01-08 00:03

    Also make sure your project JDK is set to JDK 8. Older Scala versions are not compatible with JDK 11, which is now the default in IntelliJ.

    The same also happened with Maven.

    I had a project where everything worked fine. After the latest IntelliJ upgrade it forgot the JDK setting. I tried all the steps in the other answers, but none of them helped. As a last resort, I reinstalled IntelliJ from scratch and checked out a clean repo (no .idea folder or .iml files)... and that didn't help either. Then, while setting up the project again, I noticed JDK 11. That rang a bell: I added JDK 8, and there you go, tests are green again.
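
    If the build itself should stay at the JDK 8 level, you can also pin the bytecode target in sbt. A minimal sketch, assuming a Scala 2.11/2.12 build (this does not change the IDE's project JDK, which is set in the dialog above; it only keeps the compilers from targeting newer bytecode):

    // build.sbt: keep javac and scalac emitting JDK 8 compatible bytecode
    javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
    scalacOptions += "-target:jvm-1.8"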

  • 2021-01-08 00:08

    Not sure if this was an IDE bug, but for me upgrading the IDE to the latest version didn't prove to be of any help. After wasting a few hours, here is my approach to resolving this error, which reads as follows:

    could not find implicit value for parameter prettifier: org.scalactic.Prettifier
    

    Solution:

    In IntelliJ, press Ctrl+Alt+Shift+S -> Modules -> Dependencies and search for org.scalactic:3.0.0.jar (test scope); most probably there will be another version, 2.x.x, in compile scope. Right-click the 2.x.x entry, select Edit, choose the 3.0.0 version in compile scope, and apply the new settings.

    P.S. Depending on your case there may be only one entry, but make sure you use 3.0.0 in compile scope to get rid of that weird error.
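
    If you prefer fixing this in the build rather than in the IDE dialog, a hedged sbt sketch of the same idea is to force a single scalactic version for every scope (3.0.0 mirrors the version mentioned above; adjust to your build):

    // build.sbt: one scalactic version everywhere, so the imported IntelliJ model
    // cannot end up with 2.x.x in compile scope and 3.0.0 in test scope
    dependencyOverrides += "org.scalactic" %% "scalactic" % "3.0.0"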
    
  • 2021-01-08 00:09

    I had a similar issue.

    For me, the simplest way to solve this was just removing the .idea folder and re-importing the project.

  • 2021-01-08 00:12

    As mentioned in issue 170, it can be an issue with a version mix-up around the spark-testing-base dependency.

    Make sure you are not mixing incompatible versions.

    I had the following dependencies:

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.1.0",
      "org.apache.spark" % "spark-sql_2.11" % "2.1.0",
      "org.apache.spark" % "spark-streaming_2.11" % "2.1.0",
      "org.apache.spark" % "spark-mllib_2.11" % "2.1.0",
      "com.holdenkarau" %% "spark-testing-base" % "2.1.0_0.8.0" % "test",
      "org.scalatest" % "scalatest_2.11" % "2.1.0" % "test",
      "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
      "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
    )
    

    And when I tried to run the test classes I was getting:

    Error:(32, 14) could not find implicit value for parameter pos: org.scalactic.source.Position
      test("lsi"){
    Error:(32, 14) not enough arguments for method test: (implicit pos: org.scalactic.source.Position)Unit. Unspecified value parameter pos.
      test("lsi"){
    ..........
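
    For context, here is a sketch of the kind of test that trips this error, assuming ScalaTest 3.0.x (the class name is made up). In ScalaTest 3.x, test(...) ends with an implicit org.scalactic.source.Position parameter, materialized by a macro from scalactic 3.x, so an older scalactic 2.x jar on the compile classpath leaves the implicit unresolved:

    import org.scalatest.FunSuite

    // Hypothetical spec: compiles only when scalatest/scalactic 3.x are on the classpath
    class LsiSpec extends FunSuite {
      test("lsi") { // implicitly requires org.scalactic.source.Position
        assert(1 + 1 == 2)
      }
    }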

    Then I changed the dependencies to:

    libraryDependencies ++= Seq(
      "org.apache.spark" % "spark-core_2.11" % "2.2.0",
      "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
      "org.apache.spark" % "spark-streaming_2.11" % "2.2.0",
      "org.apache.spark" % "spark-mllib_2.11" % "2.2.0",
      "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.8.0" % "test",
      "org.scalatest" % "scalatest_2.11" % "2.2.2" % "test",
      "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0",
      "edu.stanford.nlp" % "stanford-corenlp" % "3.8.0" classifier "models"
    )
    

    I re-imported my project (as clean and package didn't work), and the test classes passed.
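
    To spot this kind of clash without trial and error, sbt's built-in evicted task reports which conflicting versions were kicked out during resolution:

    sbt evicted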

  • 2021-01-08 00:15

    Workarounds at the bottom of the response. ;)

    This problem is related to this list of bugs:

    • SCL-8167
    • SCL-11083
    • SCL-8396
    • SCL-10912

    The problem is that some dependencies in the project are pulling in, via the test scope, other versions of scalatest and scalactic.

    IntelliJ IDEA mixes the compile scope and the test scope, but sbt handles this correctly. The IntelliJ IDEA team said in the bug reports that they are working on this.

    My workaround, at the moment, has been to move to the same older version that the other libraries are using for testing.

    Notes:

    @justin-kaeser is assigned and working on a fix. Thx!

    A lot of improvements related to the Scala plugin have landed in the latest previews.

    Example to reproduce the error : https://github.com/angelcervera/idea-dependencies-bug

    Few workarounds (items 2 and 3 are sketched in sbt form below):

    1. Remove the problematic dependencies from Project Structure -> Modules.
    2. Exclude the libraries in sbt.
    3. Use the same version everywhere.
    4. Try the latest EAP: https://www.jetbrains.com/idea/nextversion/
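
    Workarounds 2 and 3 translate to build.sbt roughly like this (a sketch: com.example:some-lib is a placeholder for whatever your dependency tree shows as the offender, and the versions are illustrative):

    // Workaround 2: exclude the clashing transitive test library from its root dependency
    libraryDependencies += ("com.example" %% "some-lib" % "1.0.0")
      .exclude("org.scalatest", "scalatest_2.11")

    // Workaround 3: pin one version everywhere so compile and test scopes agree
    dependencyOverrides += "org.scalatest" %% "scalatest" % "3.0.0"
    dependencyOverrides += "org.scalactic" %% "scalactic" % "3.0.0"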
  • 2021-01-08 00:17

    It's possible that some dependencies are transitively including incompatible versions of Scalactic or ScalaTest in the compile scope, which are also included in the test scope.

    You can check this in the Project Structure under the Project Settings / Modules / Dependencies tab, and analyze it more closely with the sbt-dependency-graph plugin.
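
    For reference, a minimal setup for that plugin, assuming sbt-dependency-graph 0.9.x (on sbt 1.4+ a built-in dependencyTree task makes the plugin unnecessary):

    // project/plugins.sbt
    addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

    Then run sbt dependencyTree (or whatDependsOn with the coordinates of the suspect artifact) to see exactly which root dependency pulls in which version.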

    sbt does, however, perform dependency evictions, which IntelliJ does not (issue); this can cause additional problems when compiling from the IDE. If sbt-dependency-graph shows that the conflicting versions are evicted, then it is probably an instance of this issue.

    Workaround: when you find the offending transitive dependency, exclude it from the root dependency in your build.sbt. For example:

    "org.apache.spark" %% "spark-core" % "2.1.0" % "provided" exclude("org.scalatest", "scalatest_2.11")
    