Spark streaming StreamingContext.start() - Error starting receiver 0

Submitted by 混江龙づ霸主 on 2019-12-31 03:32:26

Question


I have a project that uses Spark Streaming, and I'm running it with 'spark-submit', but I'm hitting this error:

15/01/14 10:34:18 ERROR ReceiverTracker: Deregistered receiver for stream 0: Error starting receiver 0 - java.lang.AbstractMethodError
    at org.apache.spark.Logging$class.log(Logging.scala:52)
    at org.apache.spark.streaming.kafka.KafkaReceiver.log(KafkaInputDStream.scala:66)
    at org.apache.spark.Logging$class.logInfo(Logging.scala:59)
    at org.apache.spark.streaming.kafka.KafkaReceiver.logInfo(KafkaInputDStream.scala:66)
    at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:86)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.startReceiver(ReceiverSupervisor.scala:121)
    at org.apache.spark.streaming.receiver.ReceiverSupervisor.start(ReceiverSupervisor.scala:106)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:264)
    at org.apache.spark.streaming.scheduler.ReceiverTracker$ReceiverLauncher$$anonfun$9.apply(ReceiverTracker.scala:257)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
    at org.apache.spark.scheduler.Task.run(Task.scala:54)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

This is the code the error comes from; everything runs fine up until ssc.start():

    val Array(zkQuorum, group, topics, numThreads) = args
    val sparkConf = new SparkConf().setAppName("Jumbly_StreamingConsumer")
    val ssc = new StreamingContext(sparkConf, Seconds(2))
    ssc.checkpoint("checkpoint")
    .
    .
    .
    ssc.start()
    ssc.awaitTermination()
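
For context, given the KafkaReceiver frames in the stack trace, the elided part of a setup like this typically creates the Kafka input stream from the parsed arguments. The sketch below is illustrative only (the actual elided code is unknown); it uses the Spark 1.x receiver-based Kafka API that matches the stack trace:

    import org.apache.spark.streaming.kafka.KafkaUtils

    // Map each topic to the number of receiver threads requested on the command line
    val topicMap = topics.split(",").map((_, numThreads.toInt)).toMap

    // Receiver-based Kafka stream; each record is a (key, message) pair,
    // so keep only the message payload
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    lines.print()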

I've run the SparkPi example using 'spark-submit' and it runs fine, so I can't figure out what's causing the problem in my application. Any help would be really appreciated.


Answer 1:


From the documentation of java.lang.AbstractMethodError:

Normally, this error is caught by the compiler; this error can only occur at run time if the definition of some class has incompatibly changed since the currently executing method was last compiled.

This means there is a version incompatibility between your compile-time and runtime dependencies — here, most likely between the Spark version your application (and its spark-streaming-kafka dependency) was compiled against and the Spark version actually running the job. Make sure you align those versions to solve this issue.
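For example, in an sbt build you would pin every Spark artifact to the same version and mark the core Spark modules as "provided", so the cluster's own runtime jars are used instead of bundled copies. The version number below is illustrative; it should match your cluster's Spark release:

    // build.sbt -- illustrative; set sparkVersion to your cluster's Spark release
    val sparkVersion = "1.2.0"

    libraryDependencies ++= Seq(
      // "provided": compile against these, but use the cluster's jars at runtime
      "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
      "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
      // the Kafka connector must be built against the same Spark version
      "org.apache.spark" %% "spark-streaming-kafka" % sparkVersion
    )

A mismatch between spark-streaming-kafka and the runtime Spark is exactly the kind of incompatible change that surfaces as an AbstractMethodError when the receiver starts.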



Source: https://stackoverflow.com/questions/27941762/spark-streaming-streamingcontext-start-error-starting-receiver-0
