Error while using Hive context in Spark: object hive is not a member of package org.apache.spark.sql

野的像风 2021-02-19 18:14

I am trying to construct a HiveContext, which inherits from SQLContext.

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the error: object hive is not a member of package org.apache.spark.sql

4 Answers
  • 2021-02-19 18:48

    For Maven projects, after you add the Hive dependency, update the project by right-clicking on your project -> Maven -> Update Project. This should resolve the issue.

  • 2021-02-19 18:51

    Because of Hive's many dependencies, Hive support is not compiled into the Spark binary by default; you have to build it yourself. Quoting the Spark documentation:

    However, since Hive has a large number of dependencies, it is not included in the default Spark assembly. In order to use Hive you must first run sbt/sbt -Phive assembly/assembly (or use -Phive for maven).

  • 2021-02-19 18:56

    Using sbt:

    You have to include spark-hive in your dependencies.

    To do so, add the following line to your build.sbt file:

    libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
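
    For context, a minimal build.sbt with that line in place might look like the sketch below; the project name, Scala version, and the extra spark-core/spark-sql entries are assumptions chosen to match Spark 1.5.0:

    // build.sbt -- minimal sketch, assuming Scala 2.10.x and Spark 1.5.0
    name := "spark-hive-example"

    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.5.0",
      "org.apache.spark" %% "spark-sql"  % "1.5.0",
      "org.apache.spark" %% "spark-hive" % "1.5.0"
    )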

  • 2021-02-19 18:56

    Here's an example Maven dependency:

    <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.tools.version}</artifactId>
            <version>${spark.version}</version>
    </dependency>
    

    For those who need to know how to set the corresponding properties in the POM, below is an example:

    <properties>
            <maven.compiler.source>1.7</maven.compiler.source>
            <maven.compiler.target>1.7</maven.compiler.target>
            <encoding>UTF-8</encoding>
            <scala.tools.version>2.10</scala.tools.version>
            <scala.version>2.10.4</scala.version>
            <spark.version>1.5.0</spark.version>
    </properties>
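
    With the dependency resolved, the HiveContext line from the question compiles. Below is a minimal, hedged usage sketch against the Spark 1.x API; the table name src and the queries are hypothetical and only meant to verify the setup:

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch, assuming Spark 1.5.x; the table "src" is hypothetical.
    object HiveContextExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("HiveContextExample")
          .setMaster("local[*]") // local mode for testing; drop when using spark-submit
        val sc = new SparkContext(conf)

        // Compiles only when spark-hive is on the classpath.
        val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

        sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
        sqlContext.sql("SELECT COUNT(*) FROM src").show()

        sc.stop()
      }
    }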
    