I have a simple Spark function to test DataFrame windowing:
import org.apache.spark.sql.{DataFrame, SparkSession}

object ScratchPad {
  def main(args: Array[String]): Unit = {
    // ...
  }
}
By default, Hive uses two metastores: the first is the metastore service, and the second is the database, which is called metastore_db by default and uses Derby. So I think you have to install and configure Derby with Hive. But I haven't seen any use of Hive in your code. I hope my answer helps you.
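If the test doesn't need Hive at all, a plain SparkSession avoids the Derby-backed metastore entirely. A minimal sketch (the local master and app name are assumptions, not from the question):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("windowing-test")
  // no .enableHiveSupport(), so no metastore_db / Derby is created
  .getOrCreate()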
The following quick and dirty hack solves the problem:
System.setSecurityManager(null)
Anyway, as it's related to automated tests only, maybe it's not that problematic after all ;)
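If you'd rather keep the hack inside the test code, one place to put it is a beforeAll hook so the security manager is cleared before Spark and Derby start up. A minimal sketch assuming ScalaTest; the suite name and test body are illustrative:

import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

class MySparkSpec extends AnyFunSuite with BeforeAndAfterAll {
  override def beforeAll(): Unit = {
    System.setSecurityManager(null) // SPARK-22918 workaround, before Spark/Derby init
    super.beforeAll()
  }

  test("windowing") {
    // build the SparkSession and run the assertions here
  }
}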
If you're looking for a cleaner way, then inside build.sbt:
test in Test := {
  System.setSecurityManager(null) // SPARK-22918
  (test in Test).value
}
This applies the fix to all tests in all files without touching the test code itself.
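For what it's worth, the same hook in sbt 1.x slash syntax (assuming sbt 1.1 or later, where test in Test is written Test / test):

Test / test := {
  System.setSecurityManager(null) // SPARK-22918
  (Test / test).value
}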
I solved this problem by excluding the wrong version of Derby and including the right one in build.sbt:
project.settings(libraryDependencies ++= Seq(
  // ... your Spark modules go here as well ...
  "org.apache.derby" % "derby" % "10.11.1.1" % Test)
  .map {
    // strip the Derby version that the Spark modules pull in transitively
    case module if module.name.contains("spark") => module.excludeAll(
      ExclusionRule(organization = "org.apache.derby"))
    case module => module
  })
It doesn't use hacks, just manual dependency resolution.
Adding this line to the test class to disable Hive worked for me:
override implicit def enableHiveSupport: Boolean = false
Got it here:
https://github.com/holdenk/spark-testing-base/issues/148
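For context, a minimal sketch of where that override lives, assuming spark-testing-base's DataFrameSuiteBase with a recent ScalaTest (the suite name and test body are illustrative):

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.funsuite.AnyFunSuite

class WindowingSpec extends AnyFunSuite with DataFrameSuiteBase {
  // skip Hive support so no Derby-backed metastore is spun up
  override implicit def enableHiveSupport: Boolean = false

  test("windowing") {
    import spark.implicits._
    val df = Seq(1, 2, 3).toDF("n")
    assert(df.count() == 3)
  }
}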