Message encryption in Kafka streaming

Submitted by 有些话、适合烂在心里 on 2020-01-25 06:49:14

Question


I have recently been experimenting with Kafka streaming for some sensitive data processing. The goal I wish to achieve is to keep the sensitive data encrypted without jeopardising the benefits of a microservice architecture, i.e., loosely coupled services and stream data processing.

My question is: in Kafka streaming, is it possible to decrypt an incoming message with one key and encrypt it again with another key? I have a rough plan, but since I am not familiar with Kafka Streams, I cannot confirm that it can handle this using the Streams DSL. Can anyone help me with this question and, preferably, tell me which functions in the Streams DSL can handle it?

Update: Basically, what I am asking is this: I want to apply public-key encryption twice to a single message in the streaming pipeline, once on the inbound topic and once on the outbound topic. I am just not sure whether the DSL can decrypt and encrypt, and where the keys should be stored.
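The decrypt-then-re-encrypt step itself is plain cryptography and does not depend on Kafka at all. A minimal sketch using only the JDK's `javax.crypto`, with two hypothetical RSA key pairs standing in for the inbound and outbound topic keys (real deployments would typically use hybrid encryption, since raw RSA limits payload size):

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Arrays;
import javax.crypto.Cipher;

public class ReEncryptDemo {
    public static void main(String[] args) throws Exception {
        // Two hypothetical key pairs: one for the inbound topic, one for the outbound topic.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair inboundKeys = gen.generateKeyPair();
        KeyPair outboundKeys = gen.generateKeyPair();

        byte[] plaintext = "sensitive payload".getBytes(StandardCharsets.UTF_8);

        // Producer side: encrypt with the inbound topic's public key.
        Cipher enc = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        enc.init(Cipher.ENCRYPT_MODE, inboundKeys.getPublic());
        byte[] inboundCiphertext = enc.doFinal(plaintext);

        // Inside the stream processor: decrypt with the inbound private key...
        Cipher dec = Cipher.getInstance("RSA/ECB/OAEPWithSHA-256AndMGF1Padding");
        dec.init(Cipher.DECRYPT_MODE, inboundKeys.getPrivate());
        byte[] recovered = dec.doFinal(inboundCiphertext);

        // ...then re-encrypt with the outbound topic's public key.
        enc.init(Cipher.ENCRYPT_MODE, outboundKeys.getPublic());
        byte[] outboundCiphertext = enc.doFinal(recovered);

        // Downstream consumer decrypts with the outbound private key.
        dec.init(Cipher.DECRYPT_MODE, outboundKeys.getPrivate());
        byte[] result = dec.doFinal(outboundCiphertext);

        System.out.println(Arrays.equals(plaintext, result));
    }
}
```

On where to store the keys: the processor only needs the inbound private key and the outbound public key, typically loaded from a keystore or secrets manager rather than hard-coded.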

Thanks!


Answer 1:


If you simply want to prevent others from inspecting your data in transit, Kafka provides SSL connections for encryption between clients and brokers, although the at-rest data will still be unencrypted. You can add SASL for authentication and ACLs for authorization to limit who can access the cluster. Limiting SSH access to the broker hosts would also help.
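For reference, the transport-security options above map to broker settings roughly like the following sketch (property names follow Kafka's documented configs; the paths, hostnames, and passwords are placeholders):

```properties
# server.properties - hypothetical values
listeners=SASL_SSL://broker1:9093
security.inter.broker.protocol=SASL_SSL
ssl.keystore.location=/var/private/ssl/kafka.broker.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/var/private/ssl/kafka.broker.truststore.jks
ssl.truststore.password=changeit
sasl.enabled.mechanisms=SCRAM-SHA-512
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
# ACL-based authorization requires an authorizer to be configured
authorizer.class.name=kafka.security.authorizer.AclAuthorizer
```

Note that none of this encrypts the message payloads on disk; that is what the custom serialization approach below addresses.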

What you are asking for requires a custom Serializer and Deserializer combination, which is used by all Kafka APIs.

When using the Kafka Streams API, you would wrap these in a Serde class and either provide it in your streams properties before starting the application, or pass it between two DSL methods via Produced.with or Consumed.with.
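A sketch of that wiring, assuming kafka-streams is on the classpath; `EncryptedSerde`, `transform(...)`, and the key objects are hypothetical stand-ins for the custom Serializer/Deserializer pair described above:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;

StreamsBuilder builder = new StreamsBuilder();
builder
    // The inbound Serde's deserializer decrypts values as they are read...
    .stream("inbound-topic",
            Consumed.with(Serdes.String(), new EncryptedSerde(inboundKeys)))
    // ...so the topology operates on plaintext values...
    .mapValues(value -> transform(value))
    // ...and the outbound Serde's serializer re-encrypts them on write.
    .to("outbound-topic",
        Produced.with(Serdes.String(), new EncryptedSerde(outboundKeys)));
```

With this layout the keys live only inside the Serdes of the streams application; upstream producers and downstream consumers each need only the key for their own topic.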



Source: https://stackoverflow.com/questions/59726938/message-encryption-in-kafka-streaming
