Kafka Elasticsearch Connector Timestamps

Posted by 梦想与她 on 2019-12-02 05:04:35

Question


I can see this has been discussed a few times here, for instance, but I think the solutions are out of date due to breaking changes in Elasticsearch.

I'm trying to convert a long/epoch field in the JSON on my Kafka topic to an Elasticsearch date type as it is pushed through the connector.

When I try to add a dynamic mapping, my Kafka Connect updates fail because I'm trying to apply two mapping types to the index, _doc and kafkaconnect. This was a breaking change around version 6, I believe, where you can only have one mapping type per index.

{
  "index_patterns": [ "depart_details" ],
  "mappings": {
    "dynamic_templates": [
      {
        "scheduled_to_date": {
          "match": "scheduled",
          "mapping": {
            "type": "date"
          }
        }
      }
    ]
  }
}
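
For reference, I'm adding this as an index template with something like the following; the template name here is just illustrative:

    PUT _template/depart_details
    {
      "index_patterns": [ "depart_details" ],
      "mappings": {
        "dynamic_templates": [
          {
            "scheduled_to_date": {
              "match": "scheduled",
              "mapping": { "type": "date" }
            }
          }
        ]
      }
    }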

I've now focused on trying to convert the message at the source, in the connector, by changing the field to a timestamp, time, or date.

    "transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
        "transforms.TimestampConverter.field" : "scheduled",
        "transforms.TimestampConverter.target.type": "Timestamp"

However, any messages I try to send through this transform fail with:

Caused by: org.apache.kafka.connect.errors.DataException: Java class class java.util.Date does not have corresponding schema type.
    at org.apache.kafka.connect.json.JsonConverter.convertToJson(JsonConverter.java:604)
    at org.apache.kafka.connect.json.JsonConverter.convertToJson(JsonConverter.java:668)
    at org.apache.kafka.connect.json.JsonConverter.convertToJsonWithoutEnvelope(JsonConverter.java:574)
    at org.apache.kafka.connect.json.JsonConverter.fromConnectData(JsonConverter.java:324)
    at io.confluent.connect.elasticsearch.DataConverter.getPayload(DataConverter.java:181)
    at io.confluent.connect.elasticsearch.DataConverter.convertRecord(DataConverter.java:163)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.tryWriteRecord(ElasticsearchWriter.java:285)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.write(ElasticsearchWriter.java:270)
    at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.put(ElasticsearchSinkTask.java:169)

This seems like a really common thing to need to do, but I don't see how to get a date or time field into Elasticsearch through this connector with version 7?


Answer 1:


The Confluent documentation states that the ES connector is currently not supported with ES 7.

According to this issue, it might suffice to change type.name=kafkaconnect to type.name=_doc in your connector configuration.
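
In the connector configuration that would look something like the fragment below, with everything else left as it is:

    "type.name": "_doc"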



Source: https://stackoverflow.com/questions/58076599/kafka-elasticsearch-connector-timestamps
