Question
I have an SBT multi-project setup, outlined at https://github.com/geoHeil/sf-sbt-multiproject-dependency-problem, and want to be able to execute sbt console
in the root project.
When executing:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().master("local[*]").enableHiveSupport.getOrCreate
spark.sql("CREATE database foo")
in the root console, the error is:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.EmbeddedDriver
Strangely, it works just fine in the sub-project:
sbt
project common
console
and then pasting the same code there works.
questions
- How can I fix sbt console to directly load the right dependencies?
- How can I load the console directly from the sub-project? sbt common/console does not seem to fix the issue (see the invocation sketch after this list).
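For reference, a batch-mode equivalent of the interactive steps above (a sketch; whether it behaves any differently from sbt common/console on a given sbt version is exactly the open question here):
sbt "project common" console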
details
The most important settings are shown below:
lazy val global = project
  .in(file("."))
  .settings(
    settings,
    libraryDependencies ++= commonDependencies
  )
  .aggregate(
    common
  )
  .dependsOn(
    common
  )

lazy val common = project
  .settings(
    name := "common",
    settings,
    libraryDependencies ++= commonDependencies
  )

lazy val dependencies =
  new {
    val sparkV = "2.3.0"

    val sparkBase = "org.apache.spark" %% "spark-core" % sparkV % "provided"
    val sparkSql  = "org.apache.spark" %% "spark-sql"  % sparkV % "provided"
    val sparkHive = "org.apache.spark" %% "spark-hive" % sparkV % "provided"
  }

lazy val commonDependencies = Seq(
  dependencies.sparkBase,
  dependencies.sparkHive,
  dependencies.sparkSql
)

lazy val settings = commonSettings

lazy val commonSettings = Seq(
  fork := true,
  run in Compile := Defaults
    .runTask(fullClasspath in Compile, mainClass.in(Compile, run), runner.in(Compile, run))
    .evaluated
)
related questions
- Transitive dependency errors in SBT multi-project
- SBT test does not work for spark test
edit
The strange thing is that this setup works just fine with Spark 2.2.0. Only 2.2.1 / 2.3.0 cause these problems, and even those versions work fine in a single-project setup or when the console is started in the right project.
Also,
java.security.AccessControlException: access denied org.apache.derby.security.SystemPermission( "engine", "usederbyinternals" )
is mentioned in the stack trace.
Answer 1:
Actually, using the code from SBT test does not work for spark test:
if (appName == "dev") {
  System.setSecurityManager(null)
}
fixes it for development.
https://github.com/holdenk/spark-testing-base/issues/148
https://issues.apache.org/jira/browse/SPARK-22918
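Put together, a minimal development-only helper based on the snippet above (a sketch: the devSession name, the appName guard, and the local[*] master are illustrative assumptions, and removing the security manager should never reach production):
import org.apache.spark.sql.SparkSession

// Drop the security manager only in development, so Derby's
// SystemPermission("engine", "usederbyinternals") check cannot fail,
// then build the Hive-enabled session as in the question.
def devSession(appName: String): SparkSession = {
  if (appName == "dev") {
    System.setSecurityManager(null)
  }
  SparkSession.builder()
    .appName(appName)
    .master("local[*]")
    .enableHiveSupport()
    .getOrCreate()
}

val spark = devSession("dev")
spark.sql("CREATE DATABASE foo")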
Source: https://stackoverflow.com/questions/49608764/loading-the-right-dependencies-for-sbt-console-in-multi-project-setup-causing-de