Error while using Hive context in spark : object hive is not a member of package org.apache.spark.sql

Backend · Unresolved · 4 answers · 2210 views

野的像风 2021-02-19 18:14

I am trying to construct a HiveContext, which inherits from SQLContext.

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the error: object hive is not a member of package org.apache.spark.sql

4 Answers
  •  温柔的废话
    2021-02-19 18:56

    Using sbt:

    You have to include spark-hive in your dependencies.

    To do so, add the following line to your build.sbt file:

    libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
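    For context, a minimal build.sbt that pulls in both Spark core/SQL and the Hive module might look like this (a sketch, assuming Spark 1.5.0 and a Scala 2.10.x toolchain; adjust versions to match your cluster):

    ```scala
    // build.sbt — minimal sketch for a Spark 1.5 project with Hive support
    name := "spark-hive-example"

    version := "0.1"

    // Scala version must be binary-compatible with the Spark artifacts;
    // the %% operator appends it to the artifact name (e.g. spark-hive_2.10)
    scalaVersion := "2.10.5"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.5.0",
      "org.apache.spark" %% "spark-sql"  % "1.5.0",
      // spark-hive provides org.apache.spark.sql.hive.HiveContext;
      // without it, the compiler reports "object hive is not a member
      // of package org.apache.spark.sql"
      "org.apache.spark" %% "spark-hive" % "1.5.0"
    )
    ```

    After reloading the sbt project, `new org.apache.spark.sql.hive.HiveContext(sc)` from the question should compile.
    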
