I am new to Scala and Spark. Today I tried to write some code and ran it on Spark, but got an exception.

The code works fine in my local Scala environment:
```scala
import or
```
Thanks @Imm, I have solved this issue. The root cause is that my local Scala is 2.11.4, while my Spark cluster is running Spark 1.2.0, and Spark 1.2 was compiled against Scala 2.10.

So the solution is to compile the local code with Scala 2.10 and upload the compiled jar to the Spark cluster. Everything works fine now.
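For reference, here is a minimal sketch of a `build.sbt` that pins the project to Scala 2.10 so the resulting jar matches what a Spark 1.2.0 cluster expects. The project name and patch versions below are illustrative placeholders, not from the original post:

```scala
// build.sbt -- minimal sketch; name and exact patch versions are placeholders
name := "my-spark-job"

version := "0.1.0"

// Match the Scala binary version that Spark 1.2.x was built against
scalaVersion := "2.10.4"

// "%%" appends the Scala binary version, so this resolves to spark-core_2.10.
// "provided" keeps Spark itself out of the assembled jar, since the cluster supplies it.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
```

With this in place, `sbt package` emits the jar under `target/scala-2.10/`, and that is the artifact to hand to `spark-submit` on the cluster.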