How to pre-package external libraries when using Spark on a Mesos cluster
According to the Spark on Mesos docs, one needs to set `spark.executor.uri` to point at a Spark distribution:

```scala
val conf = new SparkConf()
  .setMaster("mesos://HOST:5050")
  .setAppName("My app")
  .set("spark.executor.uri", "<path to spark-1.4.1.tar.gz uploaded above>")
```

The docs also note that one can build a custom version of the Spark distribution. My question is whether it is possible/desirable to pre-package external libraries such as:

- spark-streaming-kafka
- elasticsearch-spark
- spark-csv
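For concreteness, these are the kinds of dependencies I mean. A minimal sketch of how they would normally be declared in the application build (the exact versions and the Scala 2.10 binary version are my assumptions, chosen to line up with Spark 1.4.1, not something taken from the docs):

```scala
// build.sbt -- illustrative only; coordinates/versions are assumptions matching Spark 1.4.x
name := "my-app"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // Already provided by the distribution referenced via spark.executor.uri
  "org.apache.spark"  %% "spark-core"            % "1.4.1" % "provided",
  "org.apache.spark"  %% "spark-streaming"       % "1.4.1" % "provided",
  // The external libraries in question
  "org.apache.spark"  %% "spark-streaming-kafka" % "1.4.1",
  "org.elasticsearch" %% "elasticsearch-spark"   % "2.1.0",
  "com.databricks"    %% "spark-csv"             % "1.2.0"
)
```

What I am asking is whether these libraries could instead be baked into the tarball that `spark.executor.uri` points to, rather than shipped with every application in this way.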