How to run Spark application assembled with Spark 2.1 on cluster with Spark 1.6?
Question: I've been told that I could build a Spark application with one version of Spark and, as long as I use sbt assembly to build it, I can then run it with spark-submit on any Spark cluster. So I've built my simple application with Spark 2.1.1; you can see my build.sbt file below (and see the sketch at the end of this section for what such a file typically contains). Then I'm starting it on my cluster with:

    cd spark-1.6.0-bin-hadoop2.6/bin/
    spark-submit --class App --master local[*] /home/oracle/spark_test/db-synchronizer.jar

So, as you can see, I'm executing it with Spark 1.6.0, and
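For context, a minimal build.sbt for this kind of setup might look like the following. This is only a sketch, not the author's actual file: the sbt-assembly plugin version, Scala version, project version, and project name are assumptions (the name is guessed from the jar path above).

    // project/plugins.sbt -- enables the sbt-assembly plugin (version is an assumption)
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

    // build.sbt -- minimal sketch; only the Spark version (2.1.1) comes from the question
    name := "db-synchronizer"    // guessed from the jar name used with spark-submit above
    version := "1.0"
    scalaVersion := "2.11.8"     // Spark 2.1.x is built against Scala 2.11

    libraryDependencies ++= Seq(
      // without a "% provided" qualifier, sbt assembly bundles Spark itself
      // into the resulting fat jar
      "org.apache.spark" %% "spark-core" % "2.1.1",
      "org.apache.spark" %% "spark-sql"  % "2.1.1"
    )

    // match the jar name used in the spark-submit command above
    assemblyJarName in assembly := "db-synchronizer.jar"

With a file like this, running sbt assembly produces the fat jar that spark-submit is then pointed at. Whether the Spark dependencies are bundled or marked "% provided" matters here: with "% provided" the application compiles against 2.1.1 but runs against whatever Spark version the cluster's spark-submit provides.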