Error while using Hive context in spark : object hive is not a member of package org.apache.spark.sql

忘掉有多难 2021-02-19 18:44

I am trying to construct a HiveContext, which inherits from SQLContext.

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the following compile error: object hive is not a member of package org.apache.spark.sql

4 Answers
  •  一生所求
    2021-02-19 18:53

    Because of Hive's dependencies, it is not compiled into the Spark binary by default; you have to build it yourself. Quote from the website:

    However, since Hive has a large number of dependencies, it is not included in the default Spark assembly. In order to use Hive you must first run sbt/sbt -Phive assembly/assembly (or use -Phive for maven).
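    The quoted note can be sketched as the two equivalent build invocations below. Both are run from the root of a Spark source checkout (an assumption here); the Maven flags beyond `-Phive` are the commonly documented ones for older Spark releases, not taken from the answer itself.

    ```shell
    # Build a Hive-enabled Spark assembly with sbt (run from the Spark source root)
    sbt/sbt -Phive assembly/assembly

    # Or the Maven equivalent: enable the hive profile and skip tests to speed up the build
    mvn -Phive -DskipTests clean package
    ```

    After rebuilding with the `hive` profile, `org.apache.spark.sql.hive.HiveContext` is present on the classpath and the original `new HiveContext(sc)` line compiles.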
