ClassCastException in spring-kafka-test using `merger()`

Submitted by 我怕爱的太早我们不能终老 on 2020-05-16 06:27:09

Question


I want to test my Kafka Streams topology with a unit test using kafka-streams-test-utils. I have been using this library for quite a while and have already built an abstraction layer around my tests with TestNG. But since I added a merge(...) to my stream, I get the following exception:

org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000001, topic=my-topic-2, partition=0, offset=0
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:318)
    at org.apache.kafka.streams.TopologyTestDriver.pipeInput(TopologyTestDriver.java:393)
Caused by: org.apache.kafka.streams.errors.StreamsException: A serializer (key: org.apache.kafka.common.serialization.ByteArraySerializer / value: org.apache.kafka.common.serialization.ByteArraySerializer) is not compatible to the actual key or value type (key type: com.MyKey / value type: com.MyValue). Change the default Serdes in StreamConfig or provide correct Serdes via method parameters.
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:94)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMap$KStreamFlatMapProcessor.process(KStreamFlatMap.java:42)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:50)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.runAndMeasureLatency(ProcessorNode.java:244)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:133)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:143)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:126)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:90)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:87)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:302)
    ... 3 more
Caused by: java.lang.ClassCastException: class com.MyKey cannot be cast to class [B (com.MyValue is in unnamed module of loader 'app'; [B is in module java.base of loader 'bootstrap')
    at org.apache.kafka.common.serialization.ByteArraySerializer.serialize(ByteArraySerializer.java:21)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:156)
    at org.apache.kafka.streams.processor.internals.RecordCollectorImpl.send(RecordCollectorImpl.java:101)
    at org.apache.kafka.streams.processor.internals.SinkNode.process(SinkNode.java:89)
    ... 15 more

Here is how I build the stream with the StreamsBuilder passed to the TopologyTestDriver:

// Block 1
KStream<MyKey, MyValue> stream2 = streamsBuilder.stream(
    "my-topic-2",
    consumedAs(OtherKey.class, OtherValue.class, AllowEmpty.NONE) // provides a default JSON Serde
).flatMap(
    (key, value) -> {
        List<KeyValue<MyKey, MyValue>> list = new ArrayList<>();
        // Do stuff and fill out the list
        return list;
    })
.through("tmp-topic");

// Block 2
KStream<MyKey, MyValue>[] branches = stream1
    .merge(stream2)
    ... business stuff

For producing messages on the source topic, I'm using TopologyTestDriver.pipeInput(...), initialized with JSON serdes. The exception is thrown while casting to a byte array, but I don't understand why: the parameter the ByteArraySerializer receives is the same class as the one that was consumed, only from a different module. The two might also be loaded by different ClassLoaders. But there is no Spring stack in the background, and everything should run synchronously.
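Roughly, the test setup looks like this (a simplified sketch; JsonSerde here stands in for the JSON serde implementation we actually use, and the property values are test dummies):

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-test");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");
TopologyTestDriver driver = new TopologyTestDriver(streamsBuilder.build(), props);

// kafka-streams-test-utils 2.0.x feeds records in through a ConsumerRecordFactory
ConsumerRecordFactory<MyKey, MyValue> factory = new ConsumerRecordFactory<>(
    "my-topic-2",
    new JsonSerde<>(MyKey.class).serializer(),
    new JsonSerde<>(MyValue.class).serializer());

// key and value are built elsewhere in the test
driver.pipeInput(factory.create(key, value));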

I'm really confused about this behavior.

The Apache Kafka dependencies are at version 2.0.1, and I'm running on OpenJDK 11. Is it possible to align the class loading of the serializers? The error occurs only when I produce something on my-topic-2; the other topic of the merge works fine.


Answer 1:


As mentioned by @bbejeck, you need to use a different overload of .through(), the one that lets you override the default (ByteArraySerde) serdes applied to the key and value:

KStream<K, V> through(java.lang.String topic,
                      Produced<K, V> produced)

Materialize this stream to a topic and creates a new KStream from the topic using the Produced instance for configuration of the key serde, value serde, and StreamPartitioner. ... This is equivalent to calling to(someTopic, Produced.with(keySerde, valueSerde)) and StreamsBuilder#stream(someTopicName, Consumed.with(keySerde, valueSerde)).
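Applied to the topology in the question, that would look roughly like this (a sketch; JsonSerde stands in for whatever JSON serde implementation the project already provides):

// Explicit serdes for the intermediate topic; substitute your own serde classes.
Serde<MyKey> keySerde = new JsonSerde<>(MyKey.class);
Serde<MyValue> valueSerde = new JsonSerde<>(MyValue.class);

KStream<MyKey, MyValue> stream2 = streamsBuilder.stream(
        "my-topic-2",
        consumedAs(OtherKey.class, OtherValue.class, AllowEmpty.NONE))
    .flatMap((key, value) -> {
        List<KeyValue<MyKey, MyValue>> list = new ArrayList<>();
        // same mapping logic as in the question
        return list;
    })
    // the Produced overload makes through() write and re-read the topic
    // with these serdes instead of the ByteArraySerde defaults
    .through("tmp-topic", Produced.with(keySerde, valueSerde));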




Answer 2:


Without seeing all of your code, I can't say for sure, but here's what I think could be happening.

Providing Serdes with Consumed only provides de/serialization when consuming the records from the input topic; Kafka Streams doesn't propagate them through the rest of the topology. At any point where a Serde is required again, Kafka Streams falls back to the ones provided in the StreamsConfig, and Serdes.ByteArraySerde is the default.

I would suggest two things to try:

  1. Use Produced.with(keySerde, valueSerde) in your sink nodes
  2. Provide the Serde for your type via the StreamsConfig.
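For illustration, the two options could look like this (a sketch; JsonSerde is a placeholder for your concrete Serde class):

// Option 1: name the serdes explicitly wherever the stream is written to a topic.
stream.through("tmp-topic",
    Produced.with(new JsonSerde<>(MyKey.class), new JsonSerde<>(MyValue.class)));

// Option 2: replace the ByteArraySerde defaults, so every operation that
// doesn't name a Serde falls back to yours. Note that a Serde configured by
// class needs a no-arg constructor and must resolve its target type in configure().
Properties props = new Properties();
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, JsonSerde.class);
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, JsonSerde.class);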

HTH, and let me know how things work out.

-Bill



Source: https://stackoverflow.com/questions/61341961/classcastexception-in-spring-kafka-test-using-merger
