How to test Kafka Streams applications with Spring Kafka?


Question


I am writing a streaming application with Kafka Streams, Spring-Kafka and Spring Boot. I cannot find any information on how to properly test stream processing done with the Kafka Streams DSL while using Spring-Kafka. The documentation mentions EmbeddedKafkaBroker, but there seems to be no information on how to test, for example, state stores.

Just to provide a simple example of what I would like to test, I have the following bean registered (where Item is Avro-generated):


    @Bean
    public KTable<String, Long> itemTotalKTable(StreamsBuilder streamsBuilder) {
        return streamsBuilder
                .stream(ITEM_TOPIC,
                        Consumed.with(Serdes.String(), itemAvroSerde))
                .mapValues((id, item) -> item.getNumber())
                .groupByKey()
                .aggregate(
                        () -> 0L,
                        (id, number, agg) -> agg + number,
                        Materialized.with(Serdes.String(), Serdes.Long()));
    }

What is a proper way to test that all item numbers are aggregated?


Answer 1:


Spring Kafka's support for Kafka Streams doesn't bring any extra API, in particular none for building streams or for their processing.

We have recently discovered for ourselves that the kafka-streams-test-utils library works well for unit tests without starting any Kafka broker (not even an embedded one).

In several of our tests we have something like this:

    KStream<String, String> stream = builder.stream(INPUT);
    stream
            .transform(() -> enricher)
            .to(OUTPUT);

    Properties config = new Properties();
    config.put(StreamsConfig.APPLICATION_ID_CONFIG, "test");
    config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9999");
    TopologyTestDriver driver = new TopologyTestDriver(builder.build(), config);

    ConsumerRecordFactory<String, String> recordFactory = new ConsumerRecordFactory<>(new StringSerializer(),
            new StringSerializer());
    driver.pipeInput(recordFactory.create(INPUT, "key", "value"));
    ProducerRecord<byte[], byte[]> result = driver.readOutput(OUTPUT);
    assertThat(result.headers().lastHeader("foo")).isNotNull();

I believe there should be some API in that TopologyTestDriver to deal with the mentioned state store.
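
Indeed, TopologyTestDriver lets you read a materialized state store by name via getKeyValueStore(). Below is a minimal sketch for the aggregation from the question; the store name "item-totals" is an assumption added here so the store can be looked up (the question's Materialized.with(...) leaves it unnamed), and plain Long values are used instead of the Avro Item to keep the snippet self-contained:

    StreamsBuilder builder = new StreamsBuilder();
    builder.stream(ITEM_TOPIC, Consumed.with(Serdes.String(), Serdes.Long()))
            .groupByKey()
            .aggregate(
                    () -> 0L,
                    (id, number, agg) -> agg + number,
                    // naming the store is an assumption so it can be fetched below
                    Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("item-totals")
                            .withKeySerde(Serdes.String())
                            .withValueSerde(Serdes.Long()));

    Properties config = new Properties();
    config.put(StreamsConfig.APPLICATION_ID_CONFIG, "test");
    config.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

    try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), config)) {
        ConsumerRecordFactory<String, Long> factory =
                new ConsumerRecordFactory<>(new StringSerializer(), new LongSerializer());
        driver.pipeInput(factory.create(ITEM_TOPIC, "item-1", 2L));
        driver.pipeInput(factory.create(ITEM_TOPIC, "item-1", 3L));

        // the driver exposes the materialized store directly, no output topic needed
        KeyValueStore<String, Long> store = driver.getKeyValueStore("item-totals");
        assertThat(store.get("item-1")).isEqualTo(5L);
    }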




Answer 2:


Maybe you could make a method which takes your KTable as a parameter and calls .toStream().to(topicname, Produced.with(keyserde, valueserde)) on it (a sketch of such a method is shown at the end of this answer); then you could do the following:

    MyTopologyBuilder builder = new MyTopologyBuilder();
    testDriver = new TopologyTestDriver(builder.build(), config);

    ConsumerRecord<byte[], byte[]> input = createStepRecord(key, record);
    testDriver.pipeInput(input);

    ProducerRecord<String, MyClass> out = testDriver.readOutput(topic,
            new StringDeserializer(), new AvroDeserializer<>(MyClass.class));

    assertThat(out.key(), is(key));
    assertEquals(myPredefinedValue, out.value());
    assertEquals(5, out.value().getMyList().size());

This should work, but there could be more elegant ways, I guess.
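
For completeness, here is a rough sketch of the helper mentioned above, applied to the KTable from the question; the method name materializeToTopic and the topic name "item-total-output" are placeholders, not anything prescribed by Spring Kafka or Kafka Streams:

    // hypothetical helper: expose the KTable's changelog on a regular topic so that
    // TopologyTestDriver.readOutput(...) can see it ("item-total-output" is an assumed name)
    void materializeToTopic(KTable<String, Long> itemTotalKTable) {
        itemTotalKTable
                .toStream()
                .to("item-total-output", Produced.with(Serdes.String(), Serdes.Long()));
    }

    // in the test, the aggregated values can then be read back and asserted:
    ProducerRecord<String, Long> total = testDriver.readOutput("item-total-output",
            new StringDeserializer(), new LongDeserializer());
    assertEquals(Long.valueOf(5L), total.value());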



Source: https://stackoverflow.com/questions/57737640/how-to-test-kafka-streams-applications-with-spring-kafka
