confluent

How to pass Basic Authentication to Confluent Schema Registry?

Submitted by 巧了我就是萌 on 2020-06-26 07:03:50
Question: I want to read data from a Confluent Cloud topic and then write to another topic. On localhost I haven't had any major problems, but the Confluent Cloud Schema Registry requires some authentication settings that I don't know how to fill in: basic.auth.credentials.source=USER_INFO schema.registry.basic.auth.user.info=: schema.registry.url=https://xxxxxxxxxx.confluent.cloud Below is the current code: import com.databricks.spark.avro.SchemaConverters import io.confluent
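For Confluent Cloud, the Schema Registry credentials go into the client/serializer configuration alongside the URL. A minimal sketch of the relevant properties (the API key and secret placeholders below are assumptions; substitute a Schema Registry API key created for your cluster):

```properties
# Hedged sketch: endpoint and credentials are placeholders
schema.registry.url=https://xxxxxxxxxx.confluent.cloud
basic.auth.credentials.source=USER_INFO
# Format is <key>:<secret> for a Schema Registry API key
schema.registry.basic.auth.user.info=SR_API_KEY:SR_API_SECRET
```

When building the client programmatically instead of from a properties file, the same keys can be passed as a config map to the Avro (de)serializer or to `CachedSchemaRegistryClient`.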

Confluent Kafka & docker-compose - error running example

Submitted by 时间秒杀一切 on 2020-06-17 02:01:06
Question: I'm trying to run the Confluent Platform all-in-one example using Docker Compose. The single-node example is here: http://docs.confluent.io/3.1.1/cp-docker-images/docs/quickstart.html#getting-started-with-docker-compose The git repository with all the Docker images also has a load of other examples, including one which is supposed to provide the Control panel etc., as detailed here: http://docs.confluent.io/3.1.2/cp-docker-images/docs/intro.html#choosing-the-right-images.
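The single-node quickstart from those docs boils down to a docker-compose file along these lines (a hedged sketch; the image tags and port mappings follow the 3.1.x documentation and may need adjusting for other releases):

```yaml
# Minimal single-node sketch, assuming cp-* 3.1.1 images
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:3.1.1
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:3.1.1
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Must be an address reachable by clients outside the compose network
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
```

The Control Center images in the repository layer additional services on top of this pair, so errors in the extended example often trace back to a broken broker/zookeeper base like the one above.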

Kafka setup with docker-compose

Submitted by 落花浮王杯 on 2020-05-09 21:06:49
Question: Hi, I'm currently setting up Kafka with Docker. I've managed to set up Zookeeper and Kafka with the published Confluent image; see the following docker-compose file: version: '2' services: zookeeper: image: confluentinc/cp-zookeeper:3.2.0 container_name: zookeeper hostname: zookeeper ports: - "2181:2181" environment: ZOOKEEPER_CLIENT_PORT: 2181 ZOOKEEPER_TICK_TIME: 2000 restart: always kafka: image: confluentinc/cp-kafka:3.2.0 hostname: kafka container_name: kafka depends_on: - zookeeper ports: -
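A common stumbling block with this kind of setup is the kafka service's environment block, which the excerpt cuts off before: without an advertised listener that resolves from the host, clients outside the compose network cannot connect. A hedged sketch of how that section often looks (the listener address and replication factor are assumptions):

```yaml
  kafka:
    image: confluentinc/cp-kafka:3.2.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Clients connect to the advertised address, so it must be
      # resolvable from wherever the clients run (here: the host)
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```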

Kafka Connect: No tasks created for a connector

Submitted by 大兔子大兔子 on 2020-03-19 05:07:29
Question: We are running Kafka Connect (Confluent Platform 5.4, i.e. Kafka 2.4) in distributed mode using the Debezium (MongoDB) and Confluent S3 connectors. When adding a new connector via the REST API, the connector is created in the RUNNING state, but no tasks are created for it. Pausing and resuming the connector does not help. When we stop all workers and then start them again, the tasks are created and everything runs as it should. The issue is not caused by the connector plugins, because we
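For diagnosing this, the Connect REST API exposes both connector and task state, and asking the worker to restart the connector sometimes re-triggers task creation without bouncing all workers. A hedged sketch (the worker address and connector name below are assumptions):

```shell
# Check connector state and its (possibly empty) task list
curl -s http://localhost:8083/connectors/my-connector/status
# Restart the connector, which can re-create missing tasks
curl -s -X POST http://localhost:8083/connectors/my-connector/restart
```

If tasks only ever appear after a full worker restart, the worker logs around the config-topic rebalance are usually the next place to look.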

Error when trying to join a table and a stream

Submitted by ≡放荡痞女 on 2020-02-05 05:31:27
Question: I am trying to join a table and a stream and create another table, as shown below: CREATE TABLE table_fx_latest AS SELECT t1.currencyid, t1.maxtimestamp, t2.midprice FROM stream_fx2 t2 LEFT JOIN table_fx_latest3 t1 ON t1.currencyid = t2.currencyid AND t1.timestamp = t2.maxtimestamp GROUP BY t1.currencyid, t1.maxtimestamp, t2.midprice; but the following error is reported: Cannot RUN execution plan for this statement, CreateTableAsSelect{name=TABLE_FX_LATEST_PRICE6, query=Query{queryBody
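One thing to check here: KSQL joins have historically allowed only a single equality condition in the ON clause, so a composite condition (`AND` over two fields) may need the join fields collapsed into one key on each side first. A hedged sketch under that assumption (column names carried over from the question; the nested CONCAT key is illustrative, not the documented fix):

```sql
-- Hedged sketch: re-key one side on a single composite column,
-- do the same on the other side, then join on the combined key
CREATE STREAM stream_fx2_keyed AS
  SELECT CONCAT(CONCAT(currencyid, '|'), CAST(maxtimestamp AS VARCHAR)) AS join_key,
         midprice
  FROM stream_fx2;
```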

Batch size in the Kafka JDBC sink connector

Submitted by ﹥>﹥吖頭↗ on 2020-01-25 03:57:21
Question: I want to read only 5000 records in a batch through the JDBC sink, for which I've set batch.size in the JDBC sink config file: name=jdbc-sink connector.class=io.confluent.connect.jdbc.JdbcSinkConnector tasks.max=1 batch.size=5000 topics=postgres_users connection.url=jdbc:postgresql://localhost:34771/postgres?user=foo&password=bar file=test.sink.txt auto.create=true But batch.size has no effect, as records are inserted into the database as soon as new records are inserted into the source
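Worth noting: batch.size is only an upper bound on how many records the sink writes per insert; how many records the framework hands to the connector per put() is bounded by the underlying consumer's max.poll.records (default 500). A hedged sketch of the usual overrides (standard Connect property names; verify against your Connect version):

```properties
# In the Connect worker config (connect-standalone.properties or
# connect-distributed.properties), raise the consumer's poll size:
consumer.max.poll.records=5000
# Or per connector, if client config overrides are enabled via
# connector.client.config.override.policy (Kafka 2.3+):
consumer.override.max.poll.records=5000
```

With fewer than 5000 records arriving per poll, the sink flushes whatever it received, which matches the immediate inserts described above.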