I am trying to construct a HiveContext, which inherits from SQLContext.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
I get an error: object hive is not a member of package org.apache.spark.sql
For Maven projects, after you add the Hive dependency, just run "Update Project" by right-clicking on your project -> Maven -> Update Project. This should solve the issue.
Because of Hive's dependencies, it is not compiled into the Spark binary by default; you have to build it yourself. Quoting from the website:
However, since Hive has a large number of dependencies, it is not included in the default Spark assembly. In order to use Hive you must first run sbt/sbt -Phive assembly/assembly (or use -Phive for maven).
Using sbt:
You have to include spark-hive in your dependencies. To do so, add the following line to your build.sbt file:
libraryDependencies += "org.apache.spark" %% "spark-hive" % "1.5.0"
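Once spark-hive is on the classpath, the HiveContext from the question compiles. A minimal self-contained sketch (the object name and the SHOW TABLES query are just illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Minimal sketch, assuming spark-hive has been added as shown above.
object HiveContextExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveContextExample")
    val sc = new SparkContext(conf)
    // This line is what fails to compile without the spark-hive dependency.
    val sqlContext = new HiveContext(sc)
    // Simple sanity check against the Hive metastore.
    sqlContext.sql("SHOW TABLES").show()
    sc.stop()
  }
}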
Here's an example Maven dependency:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_${scala.tools.version}</artifactId>
    <version>${spark.version}</version>
</dependency>
--- For those who need to know how to set the properties in the POM, below is an example:
<properties>
    <maven.compiler.source>1.7</maven.compiler.source>
    <maven.compiler.target>1.7</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.tools.version>2.10</scala.tools.version>
    <scala.version>2.10.4</scala.version>
    <spark.version>1.5.0</spark.version>
</properties>
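For completeness, here is a sketch of how those properties are referenced from the dependencies block. The spark-core entry is an assumption (most Spark projects already declare it); only the spark-hive entry comes from the answer above:

<dependencies>
    <!-- assumed core dependency; adjust to your project -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <!-- the spark-hive artifact from the answer above -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_${scala.tools.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>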