I wrote code for word count, but when I tried running it from CMD in Windows using the command below, it throws an exception.
spark-submit --class com.sample.WordCou
The other answers are correct. To add to them: don't forget to update the jar path in the spark-submit command when you change Scala versions. So if you're using sbt, after running sbt package, the path changes from target/scala-2.12/word-count-app_2.12-1.0.jar to target/scala-2.11/word-count-app_2.11-1.0.jar.
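For completeness, here is a minimal build.sbt sketch for this kind of setup (the project name and Spark version are assumptions inferred from the jar names above, not from the asker's actual build). Using %% makes sbt append the Scala binary suffix to the artifact, so the _2.11 part of the jar path stays in sync with scalaVersion automatically:

```scala
// build.sbt -- minimal sketch; project name and versions are assumptions
name := "word-count-app"
version := "1.0"

// Must match the Scala version your Spark distribution was built with
scalaVersion := "2.11.12"

// %% appends the Scala binary suffix (_2.11) to the artifact name,
// keeping the dependency and the packaged jar name consistent
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3" % "provided"
```

With this in place, sbt package emits target/scala-2.11/word-count-app_2.11-1.0.jar, which is the path you pass to spark-submit.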
Looks like you are using Spark 2.4.x with Scala 2.12. It might be a compatibility issue. Spark documentation reference: "Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x)."
I had the same issue, and solved it by changing the Scala version I use during development to match the version Spark ships with. When I start Spark with ./spark-shell, it says "Using Scala version 2.11.12", so I changed the Scala version in build.sbt from 2.12.8 to 2.11.12 and everything worked. I use Spark version 2.4.3.
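A quick way to double-check what your code is actually running against: the Scala standard library exposes its own version string, so a one-liner like this, pasted into spark-shell or into your application, prints the runtime Scala version (a plain Scala sketch, nothing Spark-specific assumed):

```scala
// Prints the version of the Scala library the JVM actually loaded,
// e.g. "2.11.12" inside a Spark 2.4.x spark-shell
println(scala.util.Properties.versionNumberString)
```

If this prints 2.11.x while your build.sbt says 2.12.x, you have exactly the mismatch described in the answers above.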