How should we address local dependencies in sbt files for Spark


Question


I have this sbt file:

offline := true
name := "hello"
version := "1.0"
scalaVersion := "2.11.7-local"
scalaHome := Some(file("/home/ubuntu/software/scala-2.11.7"))
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

How can I tell it to use this local path for Spark rather than fetching it from the web?

/home/ubuntu/software/spark-1.5.0-bin-hadoop2.6

As it stands, sbt just tries to connect to the internet for the Spark dependencies, and my VM has no internet access due to security restrictions.
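
In sbt terms, the question is how to make the jars inside that directory visible as unmanaged (local) dependencies. A minimal sketch of that idea, assuming the binary distribution keeps its jars under a lib subdirectory (unmanagedBase is sbt's standard setting for a local jar directory):

// Point sbt's unmanaged jar directory at the local Spark distribution,
// so nothing needs to be resolved over the network.
unmanagedBase := file("/home/ubuntu/software/spark-1.5.0-bin-hadoop2.6/lib")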

I eventually want to run this simple code:

import org.apache.spark.{SparkConf, SparkContext} // SparkConf is used below but was never imported
import org.apache.spark.SparkContext._
import org.apache.spark.api.java._
import org.apache.spark.api.java.function.Function // was Function_, which does not exist
import org.apache.spark.graphx._
import org.apache.spark.graphx.lib._
import org.apache.spark.graphx.PartitionStrategy._
import org.apache.spark.sql.SQLContext // needed for SQLContext below

// extends App gives the object a main method, so sbt run can execute it
object PartBQ1 extends App {
  val conf = new SparkConf().setMaster("spark://10.0.1.31:7077")
    .setAppName("CS-838-Assignment2-Question2")
    .set("spark.driver.memory", "1g")
    .set("spark.eventLog.enabled", "true")
    .set("spark.eventLog.dir", "/home/ubuntu/storage/logs")
    .set("spark.executor.memory", "21g")
    .set("spark.executor.cores", "4")
    .set("spark.cores.max", "4")
    .set("spark.task.cpus", "1")

  // the constructor parameter is named config, so conf = conf would not compile
  val sc = new SparkContext(conf)
  val sql_ctx = new SQLContext(sc)
  val graph = GraphLoader.edgeListFile(sc, "data2.txt")
}

Answer 1:


I guess you could use something like the following (assuming the Spark jars are already on your classpath). It re-wires sbt's run task to use the full compile classpath, so that "provided" dependencies such as spark-core are included when running from sbt:

run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)) 

As suggested in

https://stackoverflow.com/a/21803413/1706351

https://github.com/sbt/sbt-assembly#-provided-configuration
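
To keep sbt off the network entirely, that run override can be combined with unmanaged jars pointing at the local distribution. Below is a minimal sketch of a complete build.sbt, assuming sbt 0.13.x syntax and assuming the 1.5.0 binary distribution keeps its jars under a lib subdirectory:

name := "hello"

version := "1.0"

scalaVersion := "2.11.7"

scalaHome := Some(file("/home/ubuntu/software/scala-2.11.7"))

// Never contact remote repositories.
offline := true

// Expose every jar shipped with the local Spark distribution as an
// unmanaged dependency, replacing the resolved spark-core dependency.
unmanagedJars in Compile ++= {
  val sparkHome = file("/home/ubuntu/software/spark-1.5.0-bin-hadoop2.6")
  (sparkHome / "lib" ** "*.jar").classpath
}

// Re-wire the run task to use the full compile classpath, so that
// "provided"-style dependencies are on it when running from sbt.
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

With the jars picked up as unmanaged dependencies, the libraryDependencies line for spark-core can simply be dropped.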




Answer 2:


These are the steps I took to resolve the issue with a new MVC 2 project and Spark 1.1 (note: this answer concerns the Spark view engine for ASP.NET MVC, not Apache Spark):

1. Compile against MVC 2.0 - I double-checked the references to make sure I was linking to MVC 2 and not MVC 1. Since this was a new project, this was not an issue.

2. Add System.Web.Mvc.Html - I added System.Web.Mvc.Html to the Spark configuration, to make sure that namespace was added to all views.

In Global.asax.cs, in Application_Start:

var settings = new SparkSettings()
    .SetDebug(true)
    .SetAutomaticEncoding(true)
    .AddAssembly("Web")
    .AddNamespace("Web.Model")
    .AddNamespace("System.Collections.Generic")
    .AddNamespace("System.Linq")
    .AddNamespace("System.Web.Mvc")
    .AddNamespace("System.Web.Mvc.Html"); // each namespace becomes available in every view

// Register the view engine with these settings (SparkViewFactory
// comes from the Spark.Web.Mvc assembly).
ViewEngines.Engines.Add(new SparkViewFactory(settings));

This can also be done in web.config, inside the Spark view engine configuration block.

3. Add the typed model - Make sure you type the Spark view model. In .aspx this is done with Inherits in the page declaration, like this:

<%@ Page Title="" Language="C#" MasterPageFile="~/Views/Shared/Site.Master"
Inherits="System.Web.Mvc.ViewPage<MyModelType>" %>

In Spark:

<viewdata model="MyModelType" />


Source: https://stackoverflow.com/questions/33337536/how-should-we-address-local-dependencies-in-sbt-files-for-spark
