Exclusion of dependency of spark-core in CDH

北荒 · 2021-01-28 21:15

I'm using Spark Structured Streaming to write data coming from Kafka to HBase.

My cluster distribution is Hadoop 3.0.0-cdh6.2.0, and I'm using Spark 2.4.0.

My

1 Answer
  • 2021-01-28 21:53

    To solve this, do not use spark.driver.userClassPathFirst and spark.executor.userClassPathFirst; instead, use spark.driver.extraClassPath and spark.executor.extraClassPath.

    Definition from the official documentation: "Extra classpath entries to prepend to the classpath of the driver."

    • "prepend", as in, put in front of Spark’s core classpath.

    Example:

    --conf spark.driver.extraClassPath=C:\Users\Khalid\Documents\Projects\libs\jackson-annotations-2.6.0.jar;C:\Users\Khalid\Documents\Projects\libs\jackson-core-2.6.0.jar;C:\Users\Khalid\Documents\Projects\libs\jackson-databind-2.6.0.jar

    This solved my problem (a conflict between the version of Jackson I wanted to use and the one Spark was using).

    Hope it helps.
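    For a YARN-based CDH cluster like the one in the question, the same idea could look like the sketch below. The class name, jar paths, and application jar are hypothetical placeholders, not from the original post; also note that classpath entries are separated by `:` on Linux, whereas the Windows example above uses `;`.

    ```shell
    # Hypothetical spark-submit invocation for a CDH 6.2 / Spark 2.4 cluster.
    # All jar names and paths below are illustrative assumptions.
    spark-submit \
      --class com.example.KafkaToHBaseApp \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.driver.extraClassPath=/opt/libs/jackson-annotations-2.6.0.jar:/opt/libs/jackson-core-2.6.0.jar:/opt/libs/jackson-databind-2.6.0.jar \
      --conf spark.executor.extraClassPath=/opt/libs/jackson-annotations-2.6.0.jar:/opt/libs/jackson-core-2.6.0.jar:/opt/libs/jackson-databind-2.6.0.jar \
      my-streaming-app.jar
    ```

    Because extraClassPath prepends rather than replaces, these jars shadow the versions bundled with Spark; if prepending causes other conflicts, shading the dependency (e.g. relocating packages with maven-shade-plugin) is the usual alternative.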
