Question
A couple of days ago we upgraded from DSE 4.6 to DSE 4.7. We're running Spark jobs from Java, so I upgraded the spark-core_2.10 Maven dependency from 0.9.1 to 1.2.2 to match the newer Spark 1.2.2 that DSE now ships with. However, when I now submit jobs to the master, it logs
```
ERROR [sparkMaster-akka.actor.default-dispatcher-2] 2015-11-17 17:50:42,442 Slf4jLogger.scala:66 - org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2812534333379744973
java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 7674242335164700840, local class serialVersionUID = 2812534333379744973
[...]
```
(see full exception here: https://gist.github.com/JensRantil/371d00701ff7f2021d7d)
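For reference, the upgraded dependency described above corresponds to the standard Spark Core coordinates on Maven Central; a minimal sketch (the actual pom.xml isn't shown in this question):

```xml
<!-- Standard Apache Spark core artifact from Maven Central -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.2</version>
</dependency>
```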
Digging further into this, it turns out that DSE ships with `spark-core_2.10-1.2.2.2.jar`, which is

- not, AFAIK, published as a Maven artefact anywhere on the Internet.
- not tagged as released in the Git repository at https://github.com/apache/spark.
- shipping with a different `serialVersionUID` for `org.apache.spark.deploy.ApplicationDescription` than `spark-core_2.10-1.2.2.jar` (that is, stock 1.2.2). I have verified this using `serialver`; see the invocation sketch below.
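A sketch of that verification, assuming both jars sit in the current directory (`serialver` is the JDK's standard tool for reporting a class's serialVersionUID; depending on the JDK you may also need the Scala library and other Spark dependencies on the classpath for the class to resolve):

```
$ serialver -classpath spark-core_2.10-1.2.2.jar org.apache.spark.deploy.ApplicationDescription
$ serialver -classpath spark-core_2.10-1.2.2.2.jar org.apache.spark.deploy.ApplicationDescription
```

Run against each jar in turn, the two commands should report the differing UIDs quoted in the exception above.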
Question: The DataStax Java Spark documentation never mentions anything about importing Spark Core. Should I be able to use the standard Spark Java Maven artifact with DSE?
Additional notes:

- I have added `dse.jar` to the classpath, and that's not the issue.
- The DataStax Java Spark documentation doesn't state which Spark Core version to use. See the related question "What do I need to import to make `SparkConf` resolvable?".
- The `ApplicationDescription` class doesn't pin its `serialVersionUID`. Decompiling `ApplicationDescription.class` for 1.2.2 and 1.2.2.2, it looks like both versions have the same fields but different methods, which leads to the `serialVersionUID` being generated differently. One potential upstream DSE fix would be to simply pin the `serialVersionUID` to 7674242335164700840 (see the sketch after this list).
- Somewhat related: to make things even more complicated, we have a repackaged `spark-core_2.10-1.2.2.jar` that shadows a package which broke our application. However, we could probably use the Maven Shade Plugin for this even if we don't have the source code for 1.2.2.2 (see the relocation sketch after this list).
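To illustrate the proposed fix: pinning a `serialVersionUID` makes Java serialization use the declared constant instead of a hash computed from the class's members, so method-only differences between versions stop breaking deserialization. `ApplicationDescription` itself is written in Scala (where the `@SerialVersionUID` annotation serves the same purpose), but the mechanism is easiest to show in plain Java; the class below is purely illustrative and is not Spark's actual code:

```java
import java.io.Serializable;

// Hypothetical stand-in for org.apache.spark.deploy.ApplicationDescription.
public class AppDescriptionExample implements Serializable {
    // With this constant pinned, adding or removing methods in a later
    // version no longer changes the UID, so deserialization keeps working.
    // Without it, the JVM derives a default UID by hashing the class's
    // fields, methods, and interfaces -- which is exactly why 1.2.2 and
    // 1.2.2.2 ended up with different UIDs.
    private static final long serialVersionUID = 7674242335164700840L;

    private final String name;
    private final int maxCores;

    public AppDescriptionExample(String name, int maxCores) {
        this.name = name;
        this.maxCores = maxCores;
    }
}
```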
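As for the last point, the Maven Shade Plugin relocates packages by rewriting compiled bytecode, so no Spark source is needed. A sketch, where `com.example.broken` is a made-up placeholder for whichever package actually conflicted with our application:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- placeholder package name; relocation rewrites all
               references to it inside the shaded jar -->
          <relocation>
            <pattern>com.example.broken</pattern>
            <shadedPattern>shaded.com.example.broken</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```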
Source: https://stackoverflow.com/questions/33782857/can-i-use-regular-spark-core-application-library-to-talk-to-dse-spark