apache-kafka-streams

Global state stores don't create a changelog topic: what is the workaround if the input topic to the global store has a null key?

Submitted by 你。 on 2020-05-17 06:22:10
Question: I have read a lot about how a global state store does not create a changelog topic for restore; it uses the source topic instead. I am creating a custom key and storing the data in a global state store, but after a restart the data is gone, because on restore the global store reads directly from the source topic and bypasses the processor. My input topic has the data below.
{ "id": "user-12345", "user_client": [ "clientid-1", "clientid-2" ] }
I am maintaining two state stores as follows: id -> record (record
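A commonly suggested workaround, sketched below under assumptions (the topic names "user-input" and "user-by-id", the String serdes, and the extractId helper are all made up for illustration), is to derive the custom key in a regular processing step, write the re-keyed records to an intermediate topic, and source the global store from that topic, so a restore that bypasses the processor still reads correctly keyed data:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.state.KeyValueStore;

import java.util.Collections;
import java.util.List;

public class KeyedGlobalStoreTopology {

    public static StreamsBuilder build() {
        StreamsBuilder builder = new StreamsBuilder();

        // Re-key the un-keyed source records into an intermediate topic.
        // The value is kept as the raw JSON string for simplicity.
        builder.stream("user-input", Consumed.with(Serdes.String(), Serdes.String()))
               .flatMap((nullKey, json) -> {
                   // Extract the custom key from the payload (real JSON parsing omitted).
                   String id = extractId(json);
                   List<KeyValue<String, String>> out =
                           Collections.singletonList(KeyValue.pair(id, json));
                   return out;
               })
               .to("user-by-id", Produced.with(Serdes.String(), Serdes.String()));

        // The global table is sourced from the already-keyed topic, so a restore
        // (which bypasses any processor) still sees the keys the application needs.
        builder.globalTable("user-by-id",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("users-by-id-store"));

        return builder;
    }

    private static String extractId(String json) {
        // Placeholder for real JSON parsing.
        return json.replaceAll(".*\"id\"\\s*:\\s*\"([^\"]+)\".*", "$1");
    }
}
```

The same pattern would apply to the second, client-id keyed store; the trade-off is an extra topic that duplicates the source data, but it keeps the global store restorable without reprocessing.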

Consumer group stuck in 'rebalancing' even though there are no consumers

Submitted by 我与影子孤独终老i on 2020-05-16 21:59:14
Question: I am using Kafka version 2.4.1 (recently upgraded from 2.2.0) and noticed a strange problem. Even though the application (Kafka Streams) is down (no instance is running), the consumer group command returns the state as rebalancing. Our application runs as a Kubernetes pod.
root@bastion-0:# ./kafka-consumer-groups --describe --group groupname --bootstrap-server kafka-0.local:9094
Warning: Consumer group 'groupname' is rebalancing.
I have waited for some amount of time
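To see what the group coordinator still believes about the group, a hedged AdminClient sketch (group id and bootstrap server copied from the command above) can print the group state and any members the broker thinks are alive:

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ConsumerGroupDescription;
import org.apache.kafka.clients.admin.MemberDescription;

import java.util.Collections;
import java.util.Properties;

public class GroupStateInspector {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-0.local:9094");

        try (AdminClient admin = AdminClient.create(props)) {
            // Describe the group to see its state and which members (if any)
            // the coordinator still believes are part of it.
            ConsumerGroupDescription description = admin
                    .describeConsumerGroups(Collections.singletonList("groupname"))
                    .describedGroups()
                    .get("groupname")
                    .get();

            System.out.println("State:  " + description.state());
            for (MemberDescription member : description.members()) {
                System.out.println("Member: " + member.consumerId()
                        + " host=" + member.host()
                        + " clientId=" + member.clientId());
            }
        }
    }
}
```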

ClassCastException in spring-kafka-test using `merger()`

Submitted by 我怕爱的太早我们不能终老 on 2020-05-16 06:27:09
Question: I want to test my Kafka Streams topology with a unit test using kafka-streams-test-utils. I have been using this library for a while and have built an abstraction layer around my tests using TestNG. But since I added a merge(...) to my stream, I get the following exception:
org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_0, processor=KSTREAM-SOURCE-0000000001, topic=my-topic-2, partition=0, offset=0 at org.apache.kafka.streams.processor.internals
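The excerpt cuts off before the full stack trace or any answer, but a minimal kafka-streams-test-utils test of a merged stream, with serdes declared explicitly on every source rather than relying on defaults, would look roughly like this (topic names and the String serdes are assumptions):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class MergedStreamTest {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Both sources declare their serdes explicitly instead of relying on defaults.
        KStream<String, String> left =
                builder.stream("my-topic-1", Consumed.with(Serdes.String(), Serdes.String()));
        KStream<String, String> right =
                builder.stream("my-topic-2", Consumed.with(Serdes.String(), Serdes.String()));

        left.merge(right)
            .to("output-topic", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "merge-test");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234");

        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> in1 =
                    driver.createInputTopic("my-topic-1", new StringSerializer(), new StringSerializer());
            TestInputTopic<String, String> in2 =
                    driver.createInputTopic("my-topic-2", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> out =
                    driver.createOutputTopic("output-topic", new StringDeserializer(), new StringDeserializer());

            // Pipe one record through each side of the merge and read the result.
            in1.pipeInput("k1", "from-topic-1");
            in2.pipeInput("k2", "from-topic-2");
            System.out.println(out.readKeyValuesToList());
        }
    }
}
```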

Kafka Streams 2.5.0 requires input topic

Submitted by 浪尽此生 on 2020-05-15 08:04:10
Question: Starting with Kafka Streams 2.5.0, it seems a topology must include an input topic. In Kafka 2.4.1 (and earlier) that is not the case. I have an application whose topology just creates a few global state stores that read data from topics written to by other applications. With Kafka 2.5.0 I get this error:
13:24:27.161 [<redacted>-7cf1b5c9-4a6e-4bf2-9f77-f7f85f2df3bb-StreamThread-1] ERROR o.a.k.s.p.internals.StreamThread - stream-thread [<redacted>-7cf1b5c9-4a6e-4bf2-9f77
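For reference, the shape of topology being described (only global state stores, no regular input topic) can be sketched as follows; the topic and store names are assumptions, and whether 2.5.0 accepts such a topology is exactly what the question is about:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;

import java.util.Properties;

public class GlobalStoresOnlyApp {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Only global tables, no regular sources: the topology that reportedly
        // worked on 2.4.1 but fails on 2.5.0.
        builder.globalTable("reference-data-a",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("store-a"));
        builder.globalTable("reference-data-b",
                Consumed.with(Serdes.String(), Serdes.String()),
                Materialized.<String, String, KeyValueStore<Bytes, byte[]>>as("store-b"));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "global-stores-only");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```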

Kafka Streams application: endless rebalancing

Submitted by 家住魔仙堡 on 2020-05-12 07:02:32
Question: We are running a Kafka Streams application and are stuck with a strange problem. We use both a global state store and multiple other state stores. Our application has loaded all the data, and the state stores now hold a good amount of information. When we tried to bring the application down and back up again (after some config changes), it went into endless rebalancing. To verify, we reverted the config changes, but it is still stuck in that state. There are no errors, only INFO o
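The excerpt does not say what ultimately fixed it, but here is a hedged sketch of configuration knobs often reviewed when a stateful Streams application spends a long time rebalancing after a restart; the values are illustrative and are not taken from the question:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.streams.StreamsConfig;

import java.util.Properties;

public class RebalanceTuning {

    // Illustrative settings only; application id, servers, and paths are placeholders.
    public static Properties streamsConfig() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        // Persist RocksDB state across pod restarts so restores are incremental
        // rather than rebuilding every store from the changelog.
        props.put(StreamsConfig.STATE_DIR_CONFIG, "/persistent-volume/kafka-streams");

        // Keep warm replicas of state on other instances to shorten failover.
        props.put(StreamsConfig.NUM_STANDBY_REPLICAS_CONFIG, 1);

        // Give slow startups more time between polls before the broker evicts the member.
        props.put(StreamsConfig.consumerPrefix(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG), 600_000);
        props.put(StreamsConfig.consumerPrefix(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG), 30_000);

        return props;
    }
}
```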

Can Kafka Streams be configured to wait for KTable to load?

Submitted by 北城余情 on 2020-05-11 03:21:32
Question: I'm using a materialized KTable for a left join with my KStream (the stream is the left side). However, it seems to process immediately, without waiting for the current version of the KTable to load. I have a lot of values in the source topic for the KTable, and when I start the application a lot of joins fail (well, not really, since it is a left join). Can I delay the start so it waits for the initial topic to load?
Answer 1: Processing is time synchronized in Kafka Streams. Hence,
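The answer excerpt points at time synchronization. Below is a hedged sketch of such a stream-table left join together with the max.task.idle.ms setting, which controls how long a task waits for the slower input before processing; the topic names and the idle value are assumptions:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Joined;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

import java.util.Properties;

public class StreamTableLeftJoin {

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        KStream<String, String> events =
                builder.stream("events", Consumed.with(Serdes.String(), Serdes.String()));
        KTable<String, String> lookup =
                builder.table("lookup", Consumed.with(Serdes.String(), Serdes.String()));

        // Left join: unmatched stream records still pass through with a null table value.
        events.leftJoin(lookup,
                        (eventValue, lookupValue) -> eventValue + "|" + lookupValue,
                        Joined.with(Serdes.String(), Serdes.String(), Serdes.String()))
              .to("enriched-events", Produced.with(Serdes.String(), Serdes.String()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "stream-table-join");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        // Because processing is synchronized by record timestamp, letting a task idle
        // briefly when one input has no buffered data gives the table side a chance
        // to catch up before stream records are joined.
        props.put(StreamsConfig.MAX_TASK_IDLE_MS_CONFIG, 1_000L);

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
    }
}
```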
