Question
I'm trying to test the Spark-HBase connector in the GCP context and tried to follow [1], which asks you to locally package the connector [2] with Maven (I tried Maven 3.6.3) for Spark 2.4. This leads to the following issue.
Error when building branch "branch-2.4":
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project shc-core: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: NullPointerException -> [Help 1]
References
[1] https://github.com/GoogleCloudPlatform/cloud-bigtable-examples/tree/master/scala/bigtable-shc
[2] https://github.com/hortonworks-spark/shc/tree/branch-2.4
Answer 1:
As suggested in the comments (thanks @Ismail!), building the connector with Java 8 works. The scala-maven-plugin 3.2.2 used by the branch-2.4 build appears not to support newer JDKs, which would explain the NullPointerException:
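# select a Java 8 JDK via SDKMAN! (the exact identifier depends on your installed candidates)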
sdk use java 8.0.275-zulu
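# package the connector from the branch-2.4 checkout, skipping its tests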
mvn clean package -DskipTests
One can then reference the locally built jar in the Dependencies.scala of the GCP template as follows.
...
val shcCore = "com.hortonworks" % "shc-core" % "1.1.3-2.4-s_2.11" from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"
...
// shcCore % (shcVersionPrefix + scalaBinaryVersion) excludeAll(
shcCore excludeAll(
...
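For reference, below is a minimal sketch of what the relevant part of Dependencies.scala could look like once the local jar is wired in. The values of shcVersionPrefix and scalaBinaryVersion and the exclusion rules are illustrative assumptions; the actual template defines its own.

import sbt._

object Dependencies {
  // Illustrative values; the real template defines its own.
  val scalaBinaryVersion = "2.11"
  val shcVersionPrefix = "1.1.3-2.4-s_"

  // Resolve shc-core from the jar produced by `mvn clean package -DskipTests`.
  val shcCore = "com.hortonworks" % "shc-core" % (shcVersionPrefix + scalaBinaryVersion) from "file:///<path_to_jar>/shc-core-1.1.3-2.4-s_2.11.jar"

  // Keep the exclusions that were applied to the published artifact; which
  // organizations to exclude depends on the template (these are examples).
  val shcCoreDependency: ModuleID = shcCore excludeAll(
    ExclusionRule(organization = "org.apache.spark"),
    ExclusionRule(organization = "org.apache.hadoop")
  )
}

The key change is the from clause pointing at the locally built jar; the rest of the dependency wiring can stay as in the template.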
Source: https://stackoverflow.com/questions/65429730/spark-hbase-gcp-template-1-3-how-to-locally-package-the-hortonworks-connec