I have a simple Spark function to test DataFrame windowing:

import org.apache.spark.sql.{DataFrame, SparkSession}

object ScratchPad {
  def main(args: Array[String]): Unit = {
    // ... (windowing test body elided)
  }
}
I solved this problem by excluding the wrong version of Derby (pulled in transitively by Spark) and including the right one in build.sbt:
project.settings(libraryDependencies ++= Seq(
  // your Spark modules go here (e.g. spark-core, spark-sql),
  // followed by the Derby version you actually want:
  "org.apache.derby" % "derby" % "10.11.1.1" % Test
).map {
  // strip Derby from every Spark module so only the pinned version remains
  case module if module.name.contains("spark") =>
    module.excludeAll(ExclusionRule(organization = "org.apache.derby"))
  case module => module
})
There are no hacks involved, just manual dependency resolution.
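If mapping over the whole dependency list feels heavy, sbt also ships a `dependencyOverrides` key that pins a module to one version across the entire dependency graph. This is a sketch of that alternative, assuming the same Derby version as above is the one you want:

```scala
// build.sbt — alternative: instead of excluding Derby from each Spark module,
// force every transitive reference to resolve to a single Derby version.
dependencyOverrides += "org.apache.derby" % "derby" % "10.11.1.1"
```

Note that an override only changes which version is resolved; unlike `excludeAll`, it does not remove Derby from the classpath entirely.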