SBT test does not work for Spark test

Backend · open · 5 answers · 748 views
误落风尘 2021-01-04 21:17

I have a simple Spark function to test DataFrame windowing:

    import org.apache.spark.sql.{DataFrame, SparkSession}

    object ScratchPad {

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[*]").getOrCreate()
        // ... DataFrame windowing logic elided ...
      }
    }
5 Answers
  • 2021-01-04 21:32

    By default, Hive uses two metastore pieces: the metastore service itself and a backing database, named metastore_db by default and running on Derby. So I think you have to install and configure Derby alongside Hive. That said, I don't see Hive being used anywhere in your code. I hope my answer helps you.
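
    In Spark specifically, that embedded Derby metastore only comes into play once Hive support is enabled on the session. A minimal illustration of the switch in question (the master URL is an assumption):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .enableHiveSupport() // this is what creates the Derby-backed metastore_db
      .getOrCreate()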

  • 2021-01-04 21:34

    The following quick-and-dirty hack solves the problem:

    System.setSecurityManager(null)
    

    Anyway, since this only affects automated tests, maybe it's not that problematic after all ;)
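
    A sketch of where such a call could live so it runs before Spark (and Derby) initialize, assuming a recent ScalaTest; the suite and test names are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.scalatest.BeforeAndAfterAll
    import org.scalatest.funsuite.AnyFunSuite

    class WindowingHackSpec extends AnyFunSuite with BeforeAndAfterAll {
      override def beforeAll(): Unit = {
        System.setSecurityManager(null) // clear it before the metastore comes up
        super.beforeAll()
      }

      test("spark starts under sbt test") {
        val spark = SparkSession.builder().master("local[*]").getOrCreate()
        assert(spark.range(3).count() == 3)
      }
    }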

  • 2021-01-04 21:43

    If you're looking for a cleaner way, put this inside build.sbt:

    test in Test := {
      System.setSecurityManager(null) // SPARK-22918
      (test in Test).value
    }
    

    This applies the fix to all tests in all files without touching the test code itself.
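
    If your build is on sbt 1.x, where the slash syntax replaces `in`, the equivalent should be:

    Test / test := {
      System.setSecurityManager(null) // SPARK-22918
      (Test / test).value
    }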

  • 2021-01-04 21:51

    I have solved this problem by excluding the wrong version of Derby and including the right one in build.sbt:

    project.settings(libraryDependencies ++= Seq(
        "org.apache.derby" % "derby" % "10.11.1.1" % Test)
      .map {
        // strip the Derby that the Spark modules drag in; keep the pinned one above
        case module if module.name.contains("spark") =>
          module.excludeAll(ExclusionRule(organization = "org.apache.derby"))
        case module => module
      })
    

    It doesn't use hacks, just manual dependency resolution.
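
    Note that the `map` only has a Spark module to strip Derby from if the Spark artifacts sit in the same Seq; a fuller sketch, with an assumed Spark artifact and version:

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql" % "2.2.1" % Test, // version is an assumption
      "org.apache.derby" % "derby" % "10.11.1.1" % Test // the pinned, working Derby
    ).map {
      case module if module.name.contains("spark") =>
        module.excludeAll(ExclusionRule(organization = "org.apache.derby"))
      case module => module
    }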

  • 2021-01-04 21:56

    Adding this line to the test class to disable Hive worked for me:

          override implicit def enableHiveSupport: Boolean = false
    

    Got it here:

    https://github.com/holdenk/spark-testing-base/issues/148
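
    For context, that override targets spark-testing-base's DataFrameSuiteBase, which enables Hive support by default; a minimal sketch (suite and test names are hypothetical):

    import com.holdenkarau.spark.testing.DataFrameSuiteBase
    import org.scalatest.funsuite.AnyFunSuite

    class NoHiveSpec extends AnyFunSuite with DataFrameSuiteBase {
      // skip Hive support, so no Derby-backed metastore is created
      override implicit def enableHiveSupport: Boolean = false

      test("runs without Hive") {
        assert(spark.range(3).count() == 3)
      }
    }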
