In the Spark 2.1 docs it's mentioned that:
Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.1.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).
To add to the answer, I believe it is a typo: https://spark.apache.org/releases/spark-release-2-0-0.html makes no mention of Scala 2.12.
Also, if we look at the timing, Scala 2.12 was not released until November 2016, while Spark 2.0.0 was released in July 2016.
References: https://spark.apache.org/news/index.html
https://www.scala-lang.org/news/2.12.0/
Spark does not support Scala 2.12. You can follow SPARK-14220 (Build and test Spark against Scala 2.12) for the up-to-date status.
Update: Spark 2.4 added experimental Scala 2.12 support.
Scala 2.12 is officially supported (and required) as of Spark 3. Summary:
Using a Spark runtime that's compiled with one Scala version and a JAR file that's compiled with another Scala version is dangerous and causes strange bugs. For example, as noted here, using a Scala 2.11-compiled JAR on a Spark 3 cluster will cause this error: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps.
Look at all the poor Spark users running into this very error.
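To avoid the mismatch, the Scala version in your build has to line up with the Scala version the Spark runtime was built with. A minimal build.sbt sketch for a Spark 3 cluster (the exact version numbers are just examples for illustration):

```scala
// build.sbt -- minimal sketch; Spark 3.0.1 / Scala 2.12.12 are example versions
scalaVersion := "2.12.12" // must match the Scala version of the Spark 3 runtime

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.12 here), so the Spark artifact
  // matches scalaVersion; "provided" keeps Spark itself out of your assembled JAR
  "org.apache.spark" %% "spark-sql" % "3.0.1" % "provided"
)
```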
Make sure to look into Scala cross-compilation and understand the %% operator in sbt to limit your suffering. Maintaining Scala projects is hard, and minimizing your dependencies is recommended.
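If a library you maintain has to work against both a Scala 2.11 (Spark 2.4) and a Scala 2.12 (Spark 3) runtime, sbt's cross-building is the usual escape hatch. A sketch, again with example version numbers:

```scala
// build.sbt -- cross-building one codebase against two Scala versions (example versions)
crossScalaVersions := Seq("2.11.12", "2.12.12")

libraryDependencies ++= Seq(
  // %% resolves to spark-sql_2.11 or spark-sql_2.12 depending on the active
  // Scala version; a single % would hard-code one suffix and defeat cross-building
  "org.apache.spark" %% "spark-sql" % "2.4.8" % "provided"
)
```

Running sbt +test or sbt +package then executes the task once per Scala version in crossScalaVersions, producing separate _2.11 and _2.12 artifacts.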