confluent

How to import MS SQL Server tables to KSQL with Kafka Connect

Submitted by 主宰稳场 on 2019-12-01 09:51:50
Question: Hi, I am trying to import all tables present on a remote SQL Server into KSQL topics. This is my properties file:

    connector.class=io.confluent.connect.cdc.mssql.MsSqlSourceConnector
    name=sqlservertest
    tasks.max=1
    initial.database=$$DATABASE
    connection.url=jdbc:sqlserver://$$IP:1433;databaseName=$$DATABASE;user=$$USER;
    username=$$USER
    password=$$PASS
    server.name=$$IP
    server.port=1433
    topic.prefix=sqlservertest
    key.converter=io.confluent.connect.avro.AvroConverter
    key.converter.schema.registry.url
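The excerpt cuts off at the key converter's Schema Registry URL. For reference, the Avro converter portion of a Kafka Connect configuration usually looks like the sketch below; the http://localhost:8081 address is an assumption, not taken from the question.

    # Hedged sketch of the Avro converter block for a Kafka Connect configuration.
    # localhost:8081 is an assumed Schema Registry address, not from the original question.
    key.converter=io.confluent.connect.avro.AvroConverter
    key.converter.schema.registry.url=http://localhost:8081
    value.converter=io.confluent.connect.avro.AvroConverter
    value.converter.schema.registry.url=http://localhost:8081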

Confluent Maven repository not working?

Submitted by 僤鯓⒐⒋嵵緔 on 2019-11-30 11:13:54
I need to use the Confluent kafka-avro-serializer Maven artifact. From the official guide I should add this repository to my Maven pom:

    <repository>
      <id>confluent</id>
      <url>http://packages.confluent.io/maven/</url>
    </repository>

The problem is that the URL http://packages.confluent.io/maven/ seems to not work at the moment, as I get the response below:

    <Error>
      <Code>NoSuchKey</Code>
      <Message>The specified key does not exist.</Message>
      <Key>maven/</Key>
      <RequestId>15E287D11E5D4DFA</RequestId>
      <HostId>QVr9lCF0y3SrQoa1Z0jDWtmxD3eJz1gAEdivauojVJ+Bexb2gB6JsMpnXc+JjF95i082hgSLJSM=</HostId>
    </Error>
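The NoSuchKey response is what the object store behind packages.confluent.io returns when the bare directory URL is opened in a browser; Maven requests full artifact paths, so the repository may still resolve artifacts even though the directory listing fails. If plain HTTP really is being rejected, a commonly suggested variant is to declare the same repository over HTTPS, as in this sketch:

    <!-- Sketch: same Confluent repository, declared over HTTPS -->
    <repositories>
      <repository>
        <id>confluent</id>
        <url>https://packages.confluent.io/maven/</url>
      </repository>
    </repositories>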

KafkaAvroDeserializer does not return SpecificRecord but returns GenericRecord

Submitted by 时光毁灭记忆、已成空白 on 2019-11-29 09:15:22
Question: My KafkaProducer is able to use KafkaAvroSerializer to serialize objects to my topic. However, KafkaConsumer.poll() returns a deserialized GenericRecord instead of my serialized class. MyKafkaProducer:

    KafkaProducer<CharSequence, MyBean> producer;
    try (InputStream props = Resources.getResource("producer.props").openStream()) {
        Properties properties = new Properties();
        properties.load(props);
        properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, io.confluent.kafka.serializers
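A frequent cause of this behaviour is that KafkaAvroDeserializer produces GenericRecord by default and has to be told to materialize generated SpecificRecord classes. A minimal consumer-configuration sketch follows; the bootstrap address, group id, and Schema Registry URL are placeholders, not values from the question.

    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import io.confluent.kafka.serializers.KafkaAvroDeserializer;
    import io.confluent.kafka.serializers.KafkaAvroDeserializerConfig;

    public class SpecificAvroConsumerConfig {
        public static Properties consumerProps() {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder address
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "mybean-consumer");           // placeholder group id
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class.getName());
            props.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "http://localhost:8081"); // placeholder
            // Without this flag the deserializer returns GenericRecord;
            // with it, generated SpecificRecord classes (e.g. MyBean) are returned instead.
            props.put(KafkaAvroDeserializerConfig.SPECIFIC_AVRO_READER_CONFIG, true);
            return props;
        }
    }

For the specific reader to work, MyBean needs to be a class generated from the Avro schema (a SpecificRecord), not a plain POJO.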

Connect to Kafka running in Docker from local machine

Submitted by 和自甴很熟 on 2019-11-26 16:44:05
I set up a single-node basic Kafka deployment using Docker on my local machine, as described in the Confluent Kafka documentation (steps 2-3). In addition, I also exposed ZooKeeper's port 2181 and Kafka's port 9092 so that I can connect to them from a Java client running on the local machine:

    $ docker run -d \
        -p 2181:2181 \
        --net=confluent \
        --name=zookeeper \
        -e ZOOKEEPER_CLIENT_PORT=2181 \
        confluentinc/cp-zookeeper:4.1.0

    $ docker run -d \
        --net=confluent \
        --name=kafka \
        -p 9092:9092 \
        -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
        -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092 \
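With KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092, the broker tells every client to reconnect to the hostname kafka, which is only resolvable inside the Docker network, so a client on the host fails after the initial connection. A common arrangement is to advertise two listeners, one for containers and one for the host; the sketch below assumes port 29092 and the hostname localhost for host access (both are assumptions, not from the question):

    # Sketch: separate listeners for containers (kafka:9092) and the host (localhost:29092).
    # The offsets replication factor of 1 is only appropriate for a single-broker setup.
    $ docker run -d \
        --net=confluent \
        --name=kafka \
        -p 29092:29092 \
        -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 \
        -e KAFKA_LISTENER_SECURITY_PROTOCOL_MAP=PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT \
        -e KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092 \
        -e KAFKA_INTER_BROKER_LISTENER_NAME=PLAINTEXT \
        -e KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR=1 \
        confluentinc/cp-kafka:4.1.0

A Java client on the host would then use bootstrap.servers=localhost:29092, while containers on the confluent network keep using kafka:9092.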
