Spark application throws javax.servlet.FilterRegistration

别跟我提以往 2020-12-30 01:57

I'm using Scala to create and run a Spark application locally.

My build.sbt:

name := "SparkDemo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ...

7 Answers
  • 2020-12-30 02:34

    See my answer to a similar question here. The class conflict comes about because HBase depends on org.mortbay.jetty, and Spark depends on org.eclipse.jetty. I was able to resolve the issue by excluding org.mortbay.jetty dependencies from HBase.

    If you're pulling in hadoop-common, then you may also need to exclude javax.servlet from hadoop-common. I have a working HBase/Spark setup with my sbt dependencies set up as follows:

    val clouderaVersion = "cdh5.2.0"
    val hadoopVersion = s"2.5.0-$clouderaVersion"
    val hbaseVersion = s"0.98.6-$clouderaVersion"
    val sparkVersion = s"1.1.0-$clouderaVersion"
    
    val hadoopCommon = "org.apache.hadoop" % "hadoop-common" % hadoopVersion % "provided" excludeAll ExclusionRule(organization = "javax.servlet")
    val hbaseCommon = "org.apache.hbase" % "hbase-common" % hbaseVersion % "provided"
    val hbaseClient = "org.apache.hbase" % "hbase-client" % hbaseVersion % "provided"
    val hbaseProtocol = "org.apache.hbase" % "hbase-protocol" % hbaseVersion % "provided"
    val hbaseHadoop2Compat = "org.apache.hbase" % "hbase-hadoop2-compat" % hbaseVersion % "provided"
    val hbaseServer = "org.apache.hbase" % "hbase-server" % hbaseVersion % "provided" excludeAll ExclusionRule(organization = "org.mortbay.jetty")
    val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
    val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"
    val sparkStreamingKafka = "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion exclude("org.apache.spark", "spark-streaming_2.10")
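    These vals then still need to be wired into the build. A minimal sketch of that step (assuming you want all of the artifacts above; trim the Seq to whatever your project actually uses):

    // Register the dependencies defined above with sbt.
    libraryDependencies ++= Seq(
      hadoopCommon, hbaseCommon, hbaseClient, hbaseProtocol,
      hbaseHadoop2Compat, hbaseServer,
      sparkCore, sparkStreaming, sparkStreamingKafka
    )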
    
  • 2020-12-30 02:37

    If you are using IntelliJ IDEA, try this:

    1. Right-click the project root folder and choose Open Module Settings
    2. In the new window, choose Modules in the left navigation column
    3. In the rightmost column, select the Dependencies tab and find Maven: javax.servlet:servlet-api:2.5
    4. Finally, move this item to the bottom by pressing Alt+Down.

    It should solve this problem.

    This method came from http://wpcertification.blogspot.ru/2016/01/spark-error-class-javaxservletfilterreg.html

  • 2020-12-30 02:37

    When you use sbt, the FilterRegistration class comes from servlet-api 3.0, but if you use Jetty or Java 8, the 2.5 JAR gets pulled in automatically as a dependency.

    Fix: the servlet-api 2.5 JAR was the culprit; I resolved this issue by adding the servlet-api 3.0 JAR to the dependencies.
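    In sbt that would look roughly like the following (a sketch, not the answerer's exact build; javax.servlet % javax.servlet-api % 3.0.1 is the standard Maven coordinate for the Servlet 3.0 API):

    // Add the Servlet 3.0 API explicitly so FilterRegistration resolves from it
    // rather than from a transitively pulled-in servlet-api 2.5.
    libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1"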

  • 2020-12-30 02:41

    If this is happening in IntelliJ IDEA, go to the project settings, find the JAR among the modules, and remove it. Then run your code with sbt from the shell; it will fetch the JAR files itself. Afterwards go back to IntelliJ and re-run the code through IntelliJ. This somehow works for me and fixes the error. I'm not sure what the problem was, since it doesn't show up anymore.

    Oh, and I also removed the JAR file and added "javax.servlet:javax.servlet-api:3.1.0" through Maven by hand, and now the error is gone.

  • 2020-12-30 02:41

    For me, the following works:

    libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-core" % sparkVersion.value % "provided",
        "org.apache.spark" %% "spark-sql"  % sparkVersion.value % "provided",
        // ... other dependencies ...
    ).map(_.excludeAll(ExclusionRule(organization = "javax.servlet")))
    
  • 2020-12-30 02:42

    Try running a simple program without the Hadoop and HBase dependencies first; when you add them back, use something like:

    libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"     excludeAll(ExclusionRule(organization = "org.eclipse.jetty"))
    
    libraryDependencies += "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.6.0"
    
    
    libraryDependencies += "org.apache.hbase" % "hbase-client" % "0.98.4-hadoop2"
    
    libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.4-hadoop2"
    
    libraryDependencies += "org.apache.hbase" % "hbase-common" % "0.98.4-hadoop2"
    

    There is probably a mismatch in the dependencies. Also make sure you have the same version of the jars when you compile and when you run.

    Also, is it possible to reproduce this by running the code in the Spark shell? Then I would be able to help better.
