Is it possible to use json4s 3.2.11 with Spark 1.3.0?


Question


Spark has a dependency on json4s 3.2.10, but this version has several bugs and I need to use 3.2.11. I added the json4s-native 3.2.11 dependency to build.sbt and everything compiled fine, but when I spark-submit my JAR, it runs with 3.2.10.

build.sbt

import sbt.Keys._

name := "sparkapp"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core"  % "1.3.0" % "provided"

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.11"`

plugins.sbt

logLevel := Level.Warn

resolvers += Resolver.url("artifactory", url("http://scalasbt.artifactoryonline.com/scalasbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

App1.scala

import org.apache.spark.{Logging, SparkConf, SparkContext}

object App1 extends Logging {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("App1")
    val sc = new SparkContext(conf)
    // Print the json4s version that actually ends up on the runtime classpath
    println(s"json4s version: ${org.json4s.BuildInfo.version}")
  }
}

Environment: sbt 0.13.7, sbt-assembly 0.13.0, Scala 2.10.4

Is there a way to force 3.2.11 version usage?


Answer 1:


We ran into a problem similar to the one Necro describes, but downgrading from 3.2.11 to 3.2.10 when building the assembly jar did not resolve it. We ended up solving it (using Spark 1.3.1) by shading the 3.2.11 version in the job assembly jar:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("org.json4s.**" -> "shaded.json4s.@1").inAll
)
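
The rename rule rewrites the json4s classes bundled in your assembly (and every reference to them) into the shaded.json4s package, so the 3.2.11 copy in your jar no longer collides with the 3.2.10 copy on Spark's classpath. Note that, as far as I know, assemblyShadeRules was only introduced in sbt-assembly 0.14.0, so the 0.13.0 plugin from the question would need to be upgraded first.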



Answer 2:


I asked the same question on the Spark User mailing list and got two answers on how to make it work:

  1. Use the spark.driver.userClassPathFirst=true and spark.executor.userClassPathFirst=true options (see the sketch after this list), but this works only in Spark 1.3 and will probably require other modifications, such as excluding Scala classes from your build.

  2. Rebuild Spark with json4s 3.2.11 (you can change the version in core/pom.xml).
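
A minimal sketch of option 1; the object name is illustrative, and it assumes the executor-side flag can be set programmatically on the SparkConf, while the driver-side flag normally has to be supplied to spark-submit before the driver JVM starts:

import org.apache.spark.{SparkConf, SparkContext}

object UserClassPathFirstApp {
  def main(args: Array[String]): Unit = {
    // spark.executor.userClassPathFirst can be set on the SparkConf;
    // spark.driver.userClassPathFirst usually has to go to spark-submit
    // itself, e.g.:
    //   spark-submit --conf spark.driver.userClassPathFirst=true \
    //                --conf spark.executor.userClassPathFirst=true sparkapp.jar
    val conf = new SparkConf()
      .setAppName("App1")
      .set("spark.executor.userClassPathFirst", "true")
    val sc = new SparkContext(conf)
    // With the flag active, this should report 3.2.11 from your jar
    println(s"json4s version: ${org.json4s.BuildInfo.version}")
    sc.stop()
  }
}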

Both work fine; I preferred the second one.




Answer 3:


This is not an answer to your question, but it came up when I was searching for my problem. I was getting a NoSuchMethodError from formats.emptyValueStrategy.replaceEmpty(value) inside json4s's render. The reason was that I was building with 3.2.11 while Spark was linking against 3.2.10. I downgraded to 3.2.10 and my problem went away. Your question helped me understand what was going on (that Spark was linking a conflicting version of json4s) and I was able to resolve the problem, so thanks.
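
For anyone applying the same workaround, the downgrade is just a matter of pinning the dependency in build.sbt to the version Spark 1.3.x already ships (3.2.10, per the question):

libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.10"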



Source: https://stackoverflow.com/questions/29216712/is-it-possible-to-use-json4s-3-2-11-with-spark-1-3-0
