Scala code throws an exception in Spark

北荒 2021-01-18 00:59

I am new to Scala and Spark. Today I tried to write some code and run it on Spark, but got an exception.

This code works in local Scala:

import or
1 Answer
  • 2021-01-18 01:15

    Thanks @Imm, I have solved this issue. The root cause is that my local Scala is 2.11.4, but my Spark cluster is running version 1.2.0, and Spark 1.2 was compiled against Scala 2.10. The two Scala minor versions are not binary compatible, so a jar built with 2.11 fails on the 2.10 cluster.

    So the solution is to compile the local code with Scala 2.10 and upload the compiled jar to Spark. Everything works fine.
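    The fix above amounts to pinning the build's Scala binary version to the one the cluster's Spark was compiled against. A minimal `build.sbt` sketch, assuming sbt is used (the exact patch version `2.10.4` is illustrative; any 2.10.x release matching the cluster works):

    ```scala
    // build.sbt: pin the Scala version to match the cluster's Spark build.
    // Spark 1.2.0 artifacts were published for Scala 2.10, so a jar
    // compiled locally with Scala 2.11 is binary-incompatible with it.
    scalaVersion := "2.10.4"

    // %% appends the Scala binary version to the artifact name
    // (resolves to spark-core_2.10); "provided" keeps the jar slim,
    // since the cluster already supplies the Spark classes.
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"
    ```

    With this in place, `sbt package` produces a `_2.10` jar that loads cleanly on the 1.2.0 cluster.
    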
