Error while using Hive context in Spark: object hive is not a member of package org.apache.spark.sql

忘掉有多难 2021-02-19 18:44

I am trying to construct a HiveContext, which inherits from SQLContext:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the error: object hive is not a member of package org.apache.spark.sql

4 Answers
  •  南笙 (OP)
     2021-02-19 18:50

    Using sbt:

    You have to include spark-hive in your dependencies.

    To do so, add the following line to your .sbt file:

    libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
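
    For context, a minimal build.sbt might look like the sketch below. The project name and the Scala version shown are assumptions; match the Scala and Spark versions to your environment. Note that `%%` appends the Scala binary version to the artifact name (e.g. `spark-hive_2.10`), so the Scala version must be one the Spark artifacts were published for.

    ```scala
    // build.sbt — a sketch, not a definitive configuration
    name := "hive-context-example"  // hypothetical project name

    scalaVersion := "2.10.4"  // Spark 1.5.0 was published for Scala 2.10 and 2.11

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.5.0",
      "org.apache.spark" %% "spark-sql"  % "1.5.0",
      "org.apache.spark" %% "spark-hive" % "1.5.0"  // provides org.apache.spark.sql.hive.HiveContext
    )
    ```

    After reloading the sbt build, `new org.apache.spark.sql.hive.HiveContext(sc)` should compile.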
