The main goal is to aggregate two Kafka topics: one with compacted, slow-moving data and the other with fast-moving data that arrives every second.
I have been able to c
Change KafkaIO.<Long, String>read() to KafkaIO.<Long, Object>read().
If you look at the implementation of KafkaAvroDeserializer, you can see that it implements Deserializer<Object>:
public class KafkaAvroDeserializer extends AbstractKafkaAvroDeserializer implements Deserializer<Object>
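In other words, the value type parameter has to match the deserializer's declared element type. A quick illustration of the generics (not a full pipeline, just the type relationship):

// Compiles: KafkaAvroDeserializer is declared as a Deserializer<Object>.
Deserializer<Object> avroDeserializer = new KafkaAvroDeserializer();

// Does not compile: nothing in its type hierarchy is a Deserializer<String>,
// which is why the compiler rejects it as the value deserializer for
// KafkaIO.<Long, String>read().
// Deserializer<String> broken = new KafkaAvroDeserializer();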
I had a similar issue today, and came across the following example which resolved it for me.
https://github.com/andrewrjones/debezium-kafka-beam-example/blob/master/src/main/java/com/andrewjones/KafkaAvroConsumerExample.java
The missing piece for me was the raw (Class) cast on KafkaAvroDeserializer.class:
KafkaIO.<String, MyClass>read()
    .withBootstrapServers("kafka:9092")
    .withTopic("dbserver1.inventory.customers")
    .withKeyDeserializer(StringDeserializer.class)
    .withValueDeserializerAndCoder((Class) KafkaAvroDeserializer.class, AvroCoder.of(MyClass.class))
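One thing this snippet leaves out is how the deserializer finds the schema registry. With Confluent's KafkaAvroDeserializer that has to come in through the consumer config, and if MyClass is an Avro-generated specific class you also want specific.avro.reader=true, otherwise you get GenericRecord back at runtime. A sketch of the calls I would append to that chain (the registry URL is a placeholder, ImmutableMap is Guava's, and withConsumerConfigUpdates is the name on recent Beam versions; older ones call it updateConsumerProperties):

    .withConsumerConfigUpdates(ImmutableMap.of(
        "schema.registry.url", "http://schema-registry:8081",  // placeholder URL
        "specific.avro.reader", true))                          // decode into MyClass, not GenericRecord
    .withoutMetadata()  // yields PCollection<KV<String, MyClass>> instead of KafkaRecords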
I have faced the same issue and found the solution in this mail archive: http://mail-archives.apache.org/mod_mbox/beam-user/201710.mbox/%3CCAMsy_NiVrT_9_xfxOtK1inHxb=x_yAdBcBN+4aquu_hn0GJ0nA@mail.gmail.com%3E
In your case, you need to define your own KafkaAvroDeserializer, as follows.
import java.util.Map;
import io.confluent.kafka.serializers.AbstractKafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;
import org.apache.kafka.common.serialization.Deserializer;

// Typed wrapper around Confluent's Avro deserializer for use with KafkaIO.
public class MyClassKafkaAvroDeserializer extends AbstractKafkaAvroDeserializer
        implements Deserializer<MyClass> {

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        configure(new KafkaAvroDeserializerConfig(configs));
    }

    @Override
    public MyClass deserialize(String topic, byte[] bytes) {
        return (MyClass) this.deserialize(bytes);
    }

    @Override
    public void close() {}
}
Then specify your MyClassKafkaAvroDeserializer as the value deserializer:
p.apply(KafkaIO.<Long, MyClass>read()
    .withKeyDeserializer(LongDeserializer.class)
    .withValueDeserializer(MyClassKafkaAvroDeserializer.class));
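One caveat with this approach: AbstractKafkaAvroDeserializer hands back GenericRecord unless the specific Avro reader is enabled, so the cast to MyClass can fail at runtime. A variant I have used (my own sketch, not from the mail thread) forces that flag inside configure rather than relying on the consumer config:

@Override
public void configure(Map<String, ?> configs, boolean isKey) {
    // Copy the consumer config and switch on the specific reader so the
    // deserializer produces MyClass instances instead of GenericRecord.
    Map<String, Object> withSpecificReader = new HashMap<>(configs);
    withSpecificReader.put("specific.avro.reader", true);
    configure(new KafkaAvroDeserializerConfig(withSpecificReader));
}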
You can use KafkaAvroDeserializer as follows:
PCollection<KV<Long, MyClass>> input = p.apply(KafkaIO.<Long, MyClass>read()
    .withKeyDeserializer(LongDeserializer.class)
    .withValueDeserializerAndCoder((Class) KafkaAvroDeserializer.class, AvroCoder.of(MyClass.class))
    .withoutMetadata());
where MyClass is the POJO class generated from the Avro schema.
Make sure your POJO class is annotated with @DefaultCoder(AvroCoder.class), as in the example below:
@DefaultCoder(AvroCoder.class)
public class MyClass {
    String name;
    String age;

    MyClass() {}

    MyClass(String n, String a) {
        this.name = n;
        this.age = a;
    }
}
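With the coder in place, the decoded values behave like any other Beam elements. For example, to drop the Kafka keys and keep just the Avro objects (Values is org.apache.beam.sdk.transforms.Values, and input is the PCollection<KV<Long, MyClass>> from the read above):

// Keep only the deserialized MyClass values; AvroCoder is what lets Beam
// serialize these elements between workers.
PCollection<MyClass> values = input.apply(Values.<MyClass>create());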
Yohei's answer is good, but I also found this to work
import io.confluent.kafka.streams.serdes.avro.SpecificAvroDeserializer;
...
public static class CustomKafkaAvroDeserializer extends SpecificAvroDeserializer<MyCustomClass> {}
...
.withValueDeserializerAndCoder(CustomKafkaAvroDeserializer.class, AvroCoder.of(MyCustomClass.class))
...
where MyCustomClass is code generated with the Avro tools.
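For completeness, this is roughly how that plugs into the read. The topic, servers, and registry URL below are placeholders, and as far as I can tell SpecificAvroDeserializer enables the specific Avro reader itself, so only schema.registry.url needs to be passed in:

p.apply(KafkaIO.<Long, MyCustomClass>read()
    .withBootstrapServers("kafka:9092")   // placeholder
    .withTopic("my-topic")                // placeholder
    .withKeyDeserializer(LongDeserializer.class)
    // No raw cast needed: CustomKafkaAvroDeserializer is a Deserializer<MyCustomClass>.
    .withValueDeserializerAndCoder(CustomKafkaAvroDeserializer.class, AvroCoder.of(MyCustomClass.class))
    .withConsumerConfigUpdates(ImmutableMap.of(
        "schema.registry.url", "http://schema-registry:8081"))  // placeholder; ImmutableMap is Guava's
    .withoutMetadata());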