Question
I'm trying to write a sample program in Scala/Spark/H2O. The program compiles, but throws an exception in H2OContext.getOrCreate:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession
import org.apache.spark.rdd.RDD
import org.apache.spark.h2o._

object App1 extends App {
  val conf = new SparkConf()
  conf.setAppName("AppTest")
  conf.setMaster("local[1]")
  conf.set("spark.executor.memory", "1g")
  val sc = new SparkContext(conf)
  val spark = SparkSession.builder
    .master("local")
    .appName("ApplicationController")
    .getOrCreate()
  import spark.implicits._

  val h2oContext = H2OContext.getOrCreate(spark) // <--- error here
  import h2oContext.implicits._

  // read the CSV and keep the first column as the response
  val rawData = sc.textFile("c:\\spark\\data.csv")
  val data = rawData.map(line => line.split(',').map(_.toDouble))
  val response: RDD[Int] = data.map(row => row(0).toInt)
  val str = "count: " + response.count()
  // RDD -> DataFrame -> H2OFrame via the imported implicits
  val h2oResponse: H2OFrame = response.toDF

  sc.stop()
  spark.stop()
}
This is the exception log:
Exception in thread "main" java.lang.RuntimeException: When using the Sparkling Water as Spark package via --packages option, the 'no.priv.garshol.duke:duke:1.2' dependency has to be specified explicitly due to a bug in Spark dependency resolution.
    at org.apache.spark.h2o.H2OContext.init(H2OContext.scala:117)
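Reading the message literally, the 'no.priv.garshol.duke:duke:1.2' artifact apparently has to be declared explicitly alongside the Sparkling Water package. A minimal sketch of that idea for an sbt build (the Sparkling Water coordinate and every version below are assumptions for a Spark 2.2.x / Scala 2.11 setup, not something stated in the error):

// build.sbt -- sketch only; align coordinates and versions with your own Spark/Sparkling Water build
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"           % "2.2.1",   // assumed Spark version
  "org.apache.spark" %% "spark-sql"            % "2.2.1",
  "ai.h2o"           %% "sparkling-water-core" % "2.2.10",  // assumed Sparkling Water version
  // the artifact named in the exception, declared explicitly:
  "no.priv.garshol.duke" % "duke" % "1.2"
)

When the application is launched with spark-submit instead, the error message suggests the same duke coordinate simply has to be appended to the --packages list after the Sparkling Water package.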
Source: https://stackoverflow.com/questions/49368280/h2o-fails-on-h2ocontext-getorcreate