sbt test does not work for a Spark test

Asked by 误落风尘 on 2021-01-04 21:17

I have a simple spark function to test DF windowing:

    import org.apache.spark.sql.{DataFrame, SparkSession}

    object ScratchPad {

      def main(args: Array[String]): Unit = {
        // ... (the rest of the snippet was cut off in the original post)
      }
    }
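The post is cut off, but a minimal DataFrame windowing test of the kind described might look like the following sketch. The column names, the sample data, and the local-mode session are assumptions; only `ScratchPad` and the imports come from the original snippet. Enabling Hive support is what pulls in the Derby-backed metastore that the answers below fix.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.row_number

object ScratchPad {
  def main(args: Array[String]): Unit = {
    // Local session for a quick test run
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("scratch")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data
    val df = Seq(("a", 1), ("a", 2), ("b", 3)).toDF("key", "value")

    // Rank rows within each key by value: a typical windowing smoke test
    val w = Window.partitionBy("key").orderBy("value")
    df.withColumn("rn", row_number().over(w)).show()

    spark.stop()
  }
}
```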
5 Answers
  •  一生所求
    2021-01-04 21:51

    I solved this problem by excluding the wrong version of Derby and pinning the right one in build.sbt:

    project.settings(libraryDependencies ++= Seq(
        // Pin the Derby version the embedded test metastore needs.
        // The Spark dependencies belong in this Seq as well, so the
        // map below can rewrite them.
        "org.apache.derby" % "derby" % "10.11.1.1" % Test)
      .map {
        // Strip the transitive Derby that the Spark modules pull in
        case module if module.name.contains("spark") =>
          module.excludeAll(ExclusionRule(organization = "org.apache.derby"))
        case module => module
      })

    This doesn't rely on hacks, just manual dependency resolution.
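An equivalent way to express the same fix, without mapping over the whole dependency Seq, is to attach the exclusion to each Spark dependency directly. This is a sketch for build.sbt: the Spark module and its version here are placeholders, not taken from the original answer.

```scala
libraryDependencies ++= Seq(
  // Hypothetical Spark dependency; excludeAll drops its transitive Derby
  ("org.apache.spark" %% "spark-sql" % "3.1.1" % Test)
    .excludeAll(ExclusionRule(organization = "org.apache.derby")),
  // Then pin the Derby version the embedded Hive metastore needs
  "org.apache.derby" % "derby" % "10.11.1.1" % Test
)
```

Either form leaves exactly one Derby version on the test classpath, which is what resolves the conflict.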
