Passing additional jars to Spark via spark-submit

Submitted on 2019-11-27 16:25:31

The problem is that the driver classpath should be colon-separated, while the list passed to --jars should be comma-separated:

$SPARK_HOME/bin/spark-submit \
--driver-class-path /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar \
--jars /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar,/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar my_application.py
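To make the separator difference concrete, here is a minimal sketch (with hypothetical jar names, not the mongo-hadoop paths above) that builds both forms from the same list of jars:

```shell
# Hypothetical jar names for illustration only.
JARS="app-core.jar app-extras.jar"

# --jars expects a comma-separated list:
JARS_CSV=$(echo $JARS | tr ' ' ',')

# --driver-class-path expects a colon-separated list:
DRIVER_CP=$(echo $JARS | tr ' ' ':')

echo "$JARS_CSV"   # app-core.jar,app-extras.jar
echo "$DRIVER_CP"  # app-core.jar:app-extras.jar
```

The same jar paths appear in both options; only the separator changes.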

Adding on top of zero323's answer:

I think a better way of doing this is:

$SPARK_HOME/bin/spark-submit \
--driver-class-path $(echo /usr/local/share/mongo-hadoop/build/libs/*.jar | tr ' ' ':') \
--jars $(echo /usr/local/share/mongo-hadoop/build/libs/*.jar | tr ' ' ',') my_application.py

With this approach you won't accidentally leave a jar out of the classpath, so no missing-class warnings should appear. Note that the glob for --driver-class-path must still be joined with colons, and the glob for --jars with commas, matching the separators each option expects.
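A quick way to check what the glob-and-join trick actually produces, using throwaway files in a temporary directory (real jar paths will differ):

```shell
# Create a scratch directory with two empty placeholder jars.
dir=$(mktemp -d)
touch "$dir/a.jar" "$dir/b.jar"

# Same glob, joined with the separator each flag expects:
csv=$(echo "$dir"/*.jar | tr ' ' ',')   # for --jars
cp=$(echo "$dir"/*.jar | tr ' ' ':')    # for --driver-class-path

echo "$csv"
echo "$cp"
```

One caveat of the echo-plus-tr idiom: it breaks if any jar path contains a space, since tr would turn that space into a separator too. For typical /usr/local/share-style install paths this is not an issue.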
