Question
I am able to run my Storm Kafka topology with a LocalCluster, but not when I submit it with StormSubmitter. Below is my topology code.
Can anyone please help me solve this issue? :)
package com.org.kafka;

import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.StormSubmitter;
import org.apache.storm.generated.AlreadyAliveException;
import org.apache.storm.generated.AuthorizationException;
import org.apache.storm.generated.InvalidTopologyException;
import org.apache.storm.kafka.KafkaSpout;
import org.apache.storm.kafka.SpoutConfig;
import org.apache.storm.kafka.StringScheme;
import org.apache.storm.kafka.ZkHosts;
import org.apache.storm.spout.SchemeAsMultiScheme;
import org.apache.storm.topology.TopologyBuilder;

import kafka.api.OffsetRequest;

public class KafkaTopology {
    public static void main(String[] args)
            throws AlreadyAliveException, InvalidTopologyException, AuthorizationException {
        ZkHosts zkHosts = new ZkHosts("localhost:2181");
        SpoutConfig kafkaConfig = new SpoutConfig(zkHosts, "secondTest", "", "id7");
        kafkaConfig.scheme = new SchemeAsMultiScheme(new StringScheme());
        kafkaConfig.startOffsetTime = OffsetRequest.EarliestTime();

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("KafkaSpout", new KafkaSpout(kafkaConfig), 1);
        builder.setBolt("Sentence-bolt", new SentenceBolt(), 1).globalGrouping("KafkaSpout");
        builder.setBolt("PrinterBolt", new PrinterBolt(), 1).globalGrouping("SentenceBolt");

        LocalCluster cluster = new LocalCluster();
        Config conf = new Config();
        StormSubmitter.submitTopology("KafkaStormToplogy", conf, builder.createTopology());
        try {
            System.out.println("Waiting to consume from kafka");
            Thread.sleep(10000);
        } catch (Exception exception) {
            System.out.println("Thread interrupted exception : " + exception);
        }
        cluster.killTopology("KafkaToplogy");
        cluster.shutdown();
    }
}
I am getting the exception below in the worker.log file, but the terminal shows "Finished submitting topology: KafkaStormToplogy".
2018-01-24 11:58:38.941 o.a.s.d.worker main [ERROR] Error on initialization of server mk-worker
java.lang.RuntimeException: java.io.InvalidClassException: org.apache.storm.kafka.SpoutConfig; local class incompatible: stream classdesc serialVersionUID = -1247769246497567352, local class serialVersionUID = 6814635004761021338
at org.apache.storm.utils.Utils.javaDeserialize(Utils.java:254) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.utils.Utils.getSetComponentObject(Utils.java:504) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.daemon.task$get_task_object.invoke(task.clj:74) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.daemon.task$mk_task_data$fn__4609.invoke(task.clj:177) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.util$assoc_apply_self.invoke(util.clj:931) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.daemon.task$mk_task_data.invoke(task.clj:170) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.daemon.task$mk_task.invoke(task.clj:181) ~[storm-core-1.0.5.jar:1.0.5]
at org.apache.storm.daemon.executor$mk_executor$fn__4830.invoke(executor.clj:371) ~[storm-core-1.0.5.jar:1.0.5]
at clojure.core$map$fn__4553.invoke(core.clj:2622) ~[clojure-1.7.0.jar:?]
at clojure.lang.LazySeq.sval(LazySeq.java:40) ~[clojure-1.7.0.jar:?]
at clojure.lang.LazySeq.seq(LazySeq.java:49) ~[clojure-1.7.0.jar:?]
at clojure.lang.RT.seq(RT.java:507) ~[clojure-1.7.0.jar:?]
at clojure.core$seq__4128.invoke(core.clj:137) ~[clojure-1.7.0.jar:?]
at clojure.core.protocols$seq_reduce.invoke(protocols.clj:30) ~[clojure-1.7.0.jar:?]
at clojure.core.protocols$fn__6506.invoke(protocols.clj:101) ~[clojure-1.7.0.jar:?]
Answer 1:
I think this is either because you have different versions of storm-kafka on your Nimbus classpath vs your worker classpath, or because you're running Nimbus and the worker on different JDKs. SpoutConfig (https://github.com/apache/storm/blob/1.x-branch/external/storm-kafka/src/jvm/org/apache/storm/kafka/SpoutConfig.java) ought to declare a serialVersionUID, but it doesn't. See for reference https://stackoverflow.com/a/285809/8845188. As I understand it, the serialVersionUID is generated at runtime by the JVM, and different JDKs may generate different numbers for the same class.
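One way to check the first hypothesis is to print the serialVersionUID that each classpath actually computes for SpoutConfig and compare the output on the machine you submit from with the output on a worker node. A minimal sketch (the class name CheckSerialVersion is just an example, not part of Storm):

import java.io.ObjectStreamClass;
import org.apache.storm.kafka.SpoutConfig;

public class CheckSerialVersion {
    public static void main(String[] args) {
        // ObjectStreamClass computes the same UID that Java serialization uses,
        // so if this prints different values on the submitting machine and on a
        // worker node, the two sides are loading different/incompatible classes.
        long uid = ObjectStreamClass.lookup(SpoutConfig.class).getSerialVersionUID();
        System.out.println("SpoutConfig serialVersionUID here: " + uid);
    }
}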
I would clone storm-kafka, add the missing serialVersionUID field to SpoutConfig, rebuild storm-kafka, and try again. I've raised https://issues.apache.org/jira/browse/STORM-2911 to track fixing it; you're welcome to take a look at it.
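As an illustration of the kind of change that JIRA proposes, here is a simplified, hypothetical stand-in for the config class (the fields are placeholders, not the real SpoutConfig source); the relevant part is the explicitly declared serialVersionUID:

import java.io.Serializable;

// Hypothetical, simplified stand-in for org.apache.storm.kafka.SpoutConfig.
// Without an explicit serialVersionUID, each JVM derives its own value from
// the class bytes, so any difference between the jar Nimbus serializes with
// and the jar the worker deserializes with produces the InvalidClassException
// shown above. Pinning the value keeps the two sides compatible.
public class PatchedSpoutConfig implements Serializable {
    private static final long serialVersionUID = 1L; // pinned, stable across builds

    public String zkRoot;
    public String id;
    public long startOffsetTime;
}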
Source: https://stackoverflow.com/questions/48420085/kafkaspout-is-not-emitting-messages