Question
I want to add the GeoSpark library to Apache Spark. How do I add the GeoSpark library from the Spark shell?
Answer 1:
$ ./bin/spark-shell --master local[4] --jars code.jar
The --jars option automatically distributes your local custom jar to the driver and executors in the cluster.
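Applied to GeoSpark specifically, a minimal sketch looks like the following (the jar path is a placeholder — substitute the actual GeoSpark release jar you downloaded; it is not a fixed filename):

```shell
# Start spark-shell with the GeoSpark jar on the classpath.
# /path/to/geospark.jar is a placeholder for your downloaded jar.
./bin/spark-shell --master local[4] --jars /path/to/geospark.jar

# Multiple jars are passed as a comma-separated list:
./bin/spark-shell --master local[4] --jars /path/to/geospark.jar,/path/to/other.jar
```

For a persistent setup, the equivalent `spark.jars` property can be set in conf/spark-defaults.conf instead of passing --jars on every invocation.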
Source: https://stackoverflow.com/questions/34367085/how-to-add-our-custom-library-to-apache-spark