Kafka consumer for multiple topics

栀梦 2021-01-01 10:18

I have a list of topics (for now it's 10) whose size can increase in the future. I know we can spawn multiple threads (one per topic) to consume from each topic, but in my case if

2 Answers
  • 2021-01-01 10:30

    We can subscribe to multiple topics with a single consumer using the following API: consumer.subscribe(Collection&lt;String&gt; topics, ConsumerRebalanceListener listener), e.g. consumer.subscribe(Arrays.asList(topic1, topic2), rebalanceListener). A sketch of such a call is shown below.
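
    A minimal sketch of that subscription call, assuming an already-constructed KafkaConsumer named consumer with String deserializers; the topic names and the listener bodies are placeholders for illustration, not from the original answer:

    consumer.subscribe(Arrays.asList("topic1", "topic2"), new ConsumerRebalanceListener() {
        @Override
        public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
            // Invoked before a rebalance takes partitions away; a good place to commit offsets.
        }
        @Override
        public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
            // Invoked after partitions are assigned; a good place to seek to previously saved offsets.
        }
    });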

    The consumer knows which topic and partition each record came from, and we can commit offsets with consumer.commitAsync() or consumer.commitSync() by building an OffsetAndMetadata object per partition, as follows.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.List;
    import org.apache.kafka.clients.consumer.*;
    import org.apache.kafka.common.TopicPartition;

    // Poll with a bounded timeout; the raw poll(long) overload is deprecated in newer clients.
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    for (TopicPartition partition : records.partitions()) {
        List<ConsumerRecord<String, String>> partitionRecords = records.records(partition);
        for (ConsumerRecord<String, String> record : partitionRecords) {
            System.out.println(record.offset() + ": " + record.value());
        }
        // Commit the offset of the next record to be consumed for this partition.
        long lastOffset = partitionRecords.get(partitionRecords.size() - 1).offset();
        consumer.commitSync(Collections.singletonMap(partition, new OffsetAndMetadata(lastOffset + 1)));
    }
    
  • 2021-01-01 10:44

    There is no need for multiple threads; a single consumer can subscribe to and consume from multiple topics. The broker does not track each consumer's position itself; instead, the consumed position is kept as a committed offset (in ZooKeeper for the old consumer, or in Kafka's internal __consumer_offsets topic for the modern consumer). Whenever a consumer commits the offset of a message it has processed, that position is recorded so it does not re-consume records it has already committed. So even if the consumer fails and restarts, it resumes from the record after the last committed offset.
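
    A minimal single-threaded sketch along those lines; the class name, broker address, group id, and topic names are placeholder values, and offsets are committed synchronously after each poll:

    import java.time.Duration;
    import java.util.Arrays;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.*;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class MultiTopicConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "multi-topic-group");
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // One subscription covering every topic; add names here as the list grows.
                consumer.subscribe(Arrays.asList("topic1", "topic2", "topic3"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.println(record.topic() + " " + record.offset() + ": " + record.value());
                    }
                    consumer.commitSync(); // commit the positions returned by the last poll()
                }
            }
        }
    }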
