Best approach to ingest Streaming Data in Lagom Microservice

Submitted by 空扰寡人 on 2019-12-04 16:47:21
  1. Lagom's streaming calls use WebSockets. They are built on Play's WebSocket support, which can scale to millions of connected clients. Hundreds of wifi sensors is not a huge amount of data; Lagom should handle it easily. Lagom can also be scaled horizontally, so if the processing you're doing is heavy, you can spread it across many nodes.
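As a rough sketch of what such a streaming call looks like, here is a hypothetical Lagom (Scala API) service descriptor; the service name, path, and the idea that each WebSocket frame carries one sensor reading as a `String` are all assumptions for illustration:

```scala
import akka.NotUsed
import akka.stream.scaladsl.Source
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service, ServiceCall}

trait SensorService extends Service {

  // A streamed ServiceCall: Lagom serves this over a WebSocket.
  // Each element of the incoming Source is one frame from a sensor
  // (assumed here to be a plain String payload).
  def ingest: ServiceCall[Source[String, NotUsed], Source[String, NotUsed]]

  override def descriptor: Descriptor = {
    import Service._
    // "sensors" and the path are placeholder names for this sketch.
    named("sensors").withCalls(
      pathCall("/api/sensors/ingest", ingest)
    )
  }
}
```

Because both the request and response types are `Source`s, Lagom automatically exposes this call as a WebSocket rather than a plain HTTP call.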

  2. Publishing an incoming WebSocket stream to Kafka is not currently supported in Lagom. While Kafka does guarantee at-least-once delivery once a message has been published to Kafka, there are no such guarantees when getting that message into Kafka in the first place. For example, if you perform a side effect, such as updating a database, and then publish a message, there is no guarantee that the message will eventually reach Kafka if the application crashes between the database update and the publish (in fact it won't; that message will be lost). This is why Lagom encourages publishing only database event streams to Kafka: publishing the event log in that way guarantees that any database operation that needs to be sent to Kafka happens at least once. However, if you're not doing side effects, which it sounds like you're not, then this might not be relevant to you. In that case I would recommend using akka-streams-kafka (what Lagom's Kafka support is built on) directly.
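A minimal sketch of the direct akka-streams-kafka (Alpakka Kafka) approach, assuming the frames are `String`s and using a placeholder topic name and broker address:

```scala
import akka.NotUsed
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

object SensorToKafka {
  implicit val system: ActorSystem = ActorSystem("sensor-ingest")

  // Broker address is an assumption for this sketch.
  val producerSettings: ProducerSettings[String, String] =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  // sensorFrames would be the Source received from the streamed service call.
  // Each frame becomes one record on a hypothetical "sensor-readings" topic.
  def publish(sensorFrames: Source[String, NotUsed]) =
    sensorFrames
      .map(frame => new ProducerRecord[String, String]("sensor-readings", frame))
      .runWith(Producer.plainSink(producerSettings))
}
```

Note that `Producer.plainSink` gives no end-to-end delivery guarantee for the hop from WebSocket to Kafka, which is exactly the caveat described above: a frame consumed from the socket but not yet written to Kafka is lost if the node crashes.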

I've raised an issue referencing your use case here.
