Error while using Hive context in spark : object hive is not a member of package org.apache.spark.sql

Asked by 忘掉有多难 on 2021-02-19 18:44

I am trying to construct a HiveContext, which inherits from SQLContext:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the error: `object hive is not a member of package org.apache.spark.sql`.

4 Answers
  •  攒了一身酷, 2021-02-19 19:01

    For Maven projects, after you add the Hive dependency, right-click your project -> Maven -> Update Project. This should resolve the issue.
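    This error usually means the `spark-hive` module is missing from the compile classpath, since `HiveContext` lives in that separate artifact rather than in `spark-sql`. A minimal sketch of the Maven dependency, assuming Scala 2.11 and Spark 1.6 (the version numbers here are placeholders; match them to the Scala and Spark versions your project already uses):

    ```xml
    <!-- Hypothetical versions: align the _2.11 suffix and the version
         with the spark-core/spark-sql entries already in your pom.xml -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.11</artifactId>
        <version>1.6.3</version>
    </dependency>
    ```

    After adding this, the "Update Project" step above makes Eclipse/m2e re-download dependencies so the `org.apache.spark.sql.hive` package resolves.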
