Unable to run Unit tests (scalatest) on Spark-2.2.0 - Scala-2.11.8

Submitted by 不打扰是莪最后的温柔 on 2020-01-24 00:51:14

Question


Unable to run ScalaTest with a Spark context on Spark 2.2.0.

Stack trace:

  An exception or error caused a run to abort:
  java.lang.NoSuchMethodError: org.apache.spark.sql.test.SharedSQLContext.eventually(Lorg/scalatest/concurrent/PatienceConfiguration$Timeout;Lscala/Function0;Lorg/scalatest/concurrent/AbstractPatienceConfiguration$PatienceConfig;)Ljava/lang/Object;
      at org.apache.spark.sql.test.SharedSQLContext$class.afterEach(SharedSQLContext.scala:92)
      at testUtils.ScalaTestWithContext1.afterEach(ScalaTestWithContext1.scala:7)
      at org.scalatest.BeforeAndAfterEach$$anonfun$1.apply$mcV$sp(BeforeAndAfterEach.scala:234)

Sample code:

  import org.apache.spark.sql.SparkSession
  import testUtils.ScalaTestWithContext1

  class SampLeTest extends ScalaTestWithContext1 {
    override protected def spark: SparkSession = ???

    test("test") {
      (1 == 1) shouldBe true
    }
  }

ScalaTestWithContext1.scala

  import org.apache.spark.sql.QueryTest
  import org.apache.spark.sql.test.SharedSQLContext
  import org.scalatest.{BeforeAndAfterAll, Matchers}

  abstract class ScalaTestWithContext1 extends QueryTest with SharedSQLContext with Matchers with BeforeAndAfterAll {}

build.sbt :

name := "test"
version := "1.0"
scalaVersion := "2.11.11"

parallelExecution in Test := false

libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-library" % "2.11.11" % "provided",
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-catalyst" % "2.2.0",
  "org.apache.spark" %% "spark-core" % "2.2.0" % "test" classifier 
"tests",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "test" classifier 
"tests",
  "org.apache.spark" %% "spark-catalyst" % "2.2.0" % "test" classifier 
"tests",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test"
) 

The class ScalaTestWithContext1 extends SharedSQLContext and all required traits.

Thanks in advance.


Answer 1:


I faced a similar issue. The solution that worked for me was to use ScalaTest 2.2.6 instead of a 3.x version. The Maven repository also shows the correct dependency under "Test Dependencies".
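A minimal sketch of the corresponding change to the build.sbt above, assuming (per this answer) that Spark 2.2.0's test helpers are compiled against ScalaTest 2.2.6:

```scala
// build.sbt -- pin ScalaTest to the version the Spark 2.2.0 test jars were
// built against (2.2.6 per the answer above). ScalaTest 3.x changed the
// signature of eventually/PatienceConfig that SharedSQLContext.afterEach
// links against, which is what triggers the NoSuchMethodError at runtime.
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6" % "test"
```

This replaces the `"org.scalatest" %% "scalatest" % "3.0.1" % "test"` line in the dependency list shown in the question.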




Answer 2:


As already pointed out, check the pom.xml in Spark's GitHub repository to make sure you're using the same ScalaTest version that your Spark release was built with.

There's probably a better solution, like having sbt merge or override your preferred version of scalatest over Spark's, but as of Dec 2019 Spark 2.4.4 is using Scalatest 3.0.8, which is fairly recent.
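One way to do that override in sbt is a sketch like the following; the version string is an example taken from this answer (3.0.8 for Spark 2.4.4) and should be checked against the pom.xml of the Spark release you actually depend on:

```scala
// build.sbt -- force every module in the dependency graph to resolve the
// same ScalaTest version, so your tests and Spark's test jars agree.
// Verify the version against the <scalatest.version> property in the
// pom.xml of your Spark release (e.g. 3.0.8 for Spark 2.4.4).
dependencyOverrides += "org.scalatest" %% "scalatest" % "3.0.8" % "test"
```

Unlike `libraryDependencies`, `dependencyOverrides` does not add a dependency; it only pins the version chosen when a library is pulled in transitively, which is exactly the conflict here.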



Source: https://stackoverflow.com/questions/45816586/unable-to-run-unit-tests-scalatest-on-spark-2-2-0-scala-2-11-8
