How to pre-package external libraries when using Spark on a Mesos cluster
According to the Spark on Mesos docs, one needs to set `spark.executor.uri` pointing to a Spark distribution:

```scala
val conf = new SparkConf()
  .setMaster("mesos://HOST:5050")
  .setAppName("My app")
  .set("spark.executor.uri", "<path to spark-1.4.1.tar.gz uploaded above>")
```

The docs also note that one can build a custom version of the Spark distribution.

My question now is whether it is possible/desirable to pre-package external libraries such as

- spark-streaming-kafka
- elasticsearch-spark
- spark-csv

which will be used in mostly all of the job jars I'll submit via spark-submit, in order to reduce the time sbt assembly needs to package the fat jars.
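For illustration, here is a minimal build.sbt sketch of how the job side could look if those libraries were baked into the custom Spark distribution referenced by `spark.executor.uri`: marking them as `"provided"` keeps sbt assembly from bundling them into each job jar. The artifact coordinates and version numbers below are assumptions chosen to roughly match Spark 1.4.1 and would need adjusting to the actual cluster.

```scala
// build.sbt -- sketch; assumes these libraries ship inside the custom
// Spark distribution on the Mesos executors, so they can be excluded
// from the assembled job jar via the "provided" scope.
name := "my-spark-job"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Spark itself is always provided by the executor tarball
  "org.apache.spark"  %% "spark-core"            % "1.4.1" % "provided",
  "org.apache.spark"  %% "spark-streaming-kafka" % "1.4.1" % "provided",
  // Versions below are assumptions; pick the ones matching your distribution
  "org.elasticsearch" %% "elasticsearch-spark"   % "2.1.1" % "provided",
  "com.databricks"    %% "spark-csv"             % "1.2.0" % "provided"
)
```

An alternative that avoids a custom distribution entirely is passing the same coordinates to spark-submit's `--packages` flag, which resolves them from Maven at submit time and likewise keeps them out of the assembled jar.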