Error while using Hive context in Spark: object hive is not a member of package org.apache.spark.sql

Asked by 野的像风 on 2021-02-19 18:14

I am trying to construct a HiveContext, which inherits from SQLContext:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the error: object hive is not a member of package org.apache.spark.sql

4 Answers
  •  清歌不尽, answered 2021-02-19 18:48

    For Maven projects, after you add the Hive dependency, right-click on your project -> Maven -> Update Project. This should resolve the issue.
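
    The Hive classes live in a separate artifact from spark-core and spark-sql, so the dependency the answer refers to is the spark-hive module. As a sketch, the pom.xml entry looks like this; the Scala suffix (2.11) and version (1.6.3) are assumptions and must match the spark-core/spark-sql artifacts already in your build:

    ```xml
    <!-- Adds org.apache.spark.sql.hive.HiveContext to the classpath.
         Suffix _2.11 and version 1.6.3 are placeholders: use the same
         Scala version and Spark version as your other Spark dependencies. -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>1.6.3</version>
    </dependency>
    ```

    After updating the POM, the "Update Project" step forces Eclipse/m2e to re-download dependencies so the new classes are visible to the compiler.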
