Question
I am building a Kafka Consumer application that consumes messages from a Kafka Topic and performs a database update task. The messages are produced in a large batch once every day - so the Topic has about 1 million messages loaded in 10 minutes. The Topic has 8 partitions.
The Spring Kafka Consumer (annotated with @KafkaListener and using a ConcurrentKafkaListenerContainerFactory) is triggered in very short batches.
The batch size is sometimes just 1 or 2 messages. It would help performance if the consumer could fetch about 1,000 messages at once and process them together (for example, I could update the database in a single update SQL), instead of connecting to the database for each message.
I have already tried decreasing the concurrency in the factory to avoid multiple threads each consuming a smaller number of messages.
I also increased the socket.send.buffer.bytes property in Kafka's server.properties to 1024000, from 102400.
These steps have not increased the batch size.
Is there any other configuration I could use to increase the batch size of the consumer?
Answer 1:
See the Kafka consumer properties max.poll.records, fetch.min.bytes, fetch.max.wait.ms, fetch.max.bytes, and max.partition.fetch.bytes.
Most likely fetch.min.bytes and fetch.max.wait.ms are what you need.
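As a rough sketch of how those properties fit together: fetch.min.bytes makes the broker hold a fetch until enough data has accumulated, fetch.max.wait.ms caps how long it waits, and max.poll.records caps how many records one poll() returns to the listener. The helper class below is hypothetical and the values are illustrative, not recommendations; the property keys themselves are the standard Kafka consumer configuration names.

```java
import java.util.Properties;

public class BatchTunedConsumerProps {
    // Hypothetical helper: builds consumer properties that encourage
    // larger batches per poll. Values here are illustrative only.
    static Properties batchTunedProps() {
        Properties props = new Properties();
        // Upper bound on records returned by a single poll() (default 500).
        props.put("max.poll.records", "1000");
        // Broker holds the fetch until at least this many bytes are
        // available (default 1), so small trickles are coalesced...
        props.put("fetch.min.bytes", String.valueOf(1024 * 1024));
        // ...or until this many ms have elapsed, whichever comes first
        // (default 500).
        props.put("fetch.max.wait.ms", "1000");
        // Per-partition cap on returned data; raise it so a single
        // partition can contribute a large chunk of the batch.
        props.put("max.partition.fetch.bytes", String.valueOf(2 * 1024 * 1024));
        return props;
    }

    public static void main(String[] args) {
        batchTunedProps().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

On the Spring side, these keys go into the consumer config map passed to the ConsumerFactory. To actually receive the records as a single list per listener invocation (so one SQL update can cover the whole batch), the ConcurrentKafkaListenerContainerFactory additionally needs batch listening enabled via factory.setBatchListener(true), and the @KafkaListener method signature changed to accept a List of records.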
Source: https://stackoverflow.com/questions/50283011/how-to-increase-the-number-of-messages-consumed-by-spring-kafka-consumer-in-each