How to add “provided” dependencies back to run/test tasks' classpath?

Asked by 说谎 · 2020-11-28 22:22 · 4 answers · 1648 views

Here's an example build.sbt:

import AssemblyKeys._

assemblySettings

buildInfoSettings

net.virtualvoid.sbt.graph.Plugin.graphSettings

name :         


        
4 Answers
  • 2020-11-28 23:01

    If you use sbt-revolver plugin, here is a solution for its "reStart" task:

    fullClasspath in Revolver.reStart <<= fullClasspath in Compile

    Update: for sbt 1.0 you can use the new assignment form:

    fullClasspath in Revolver.reStart := (fullClasspath in Compile).value
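    In sbt 1.1+ the `key in Scope` notation itself was superseded by slash syntax; a sketch of the presumably equivalent setting (same sbt-revolver key, unchanged semantics):

    ```scala
    // sbt 1.1+ slash syntax: point reStart's classpath at the full Compile
    // classpath, which still contains the "provided" dependencies
    Revolver.reStart / fullClasspath := (Compile / fullClasspath).value
    ```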

  • 2020-11-28 23:04

    Adding to @douglaz' answer,

    runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
    

    is the corresponding fix for the runMain task.

  • 2020-11-28 23:08

    Another option is to create separate sbt projects for assembly vs. run/test. This lets you run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run to run directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project, e.g. to specify the Spark master in code when running with sbt.

    val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
    
    val commonSettings = Seq(
      name := "Project",
      libraryDependencies ++= Seq(...) // Common deps
    )
    
    // Project for running via spark-submit
    lazy val assemblyProj = (project in file("proj-dir"))
      .settings(
        commonSettings,
        assembly / mainClass := Some("com.example.Main"),
        libraryDependencies += sparkDep % "provided"
      )
    
    // Project for running via sbt with embedded spark
    lazy val runTestProj = (project in file("proj-dir"))
      .settings(
        // Projects' target dirs can't overlap
        target := target.value.toPath.resolveSibling("target-runtest").toFile,
        commonSettings,
        // If separate main file needed, e.g. for specifying spark master in code
        Compile / run / mainClass := Some("com.example.RunMain"),
        libraryDependencies += sparkDep
      )
    
  • 2020-11-28 23:14

    For a similar case, I used the following in assembly.sbt:

    run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)) 
    

    and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.

    Update:

    @rob's solution seems to be the only one that works on the latest sbt version; just add this to the settings in build.sbt:

    run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
    runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
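    For completeness, the same pair of settings in sbt 1.1+ slash syntax would presumably be (a sketch, assuming the `Defaults.runTask`/`Defaults.runMainTask` signatures are unchanged):

    ```scala
    // Re-add "provided" dependencies to the run/runMain classpath, slash syntax
    Compile / run := Defaults.runTask(
      Compile / fullClasspath,   // includes dependencies scoped as "provided"
      Compile / run / mainClass,
      Compile / run / runner
    ).evaluated,
    Compile / runMain := Defaults.runMainTask(
      Compile / fullClasspath,
      Compile / run / runner
    ).evaluated
    ```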
    