I am trying to create a microservice architecture using Lumen / Laravel Passport.
I have multiple dockerized services, which all run as separate containers.
Let's say you have services: MS1, MS2, MS3, MS4. The web app / mobile app hits MS1 for information. Now MS1 needs to return a response containing data that are managed by MS2, MS3 and MS4.
Poor solution - MS1 calls MS2, MS3 and MS4 to retrieve the information, aggregates it, and returns the final aggregated data.
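To make the problem with this concrete, here is a minimal sketch of the synchronous fan-out (in Python rather than PHP, and with the downstream calls stubbed out as plain functions; a real MS1 would make HTTP calls). Every request to MS1 pays the latency of all three downstream services, and any one of them being down blocks the whole response:

```python
# Sketch of the "poor" synchronous fan-out: MS1 blocks on three
# downstream calls per request. Service names and payloads are
# hypothetical stand-ins for real HTTP calls.

def fetch_from_ms2(user_id):
    return {"profile": f"profile-{user_id}"}  # stand-in for an HTTP call

def fetch_from_ms3(user_id):
    return {"orders": []}

def fetch_from_ms4(user_id):
    return {"preferences": {"theme": "dark"}}

def handle_request(user_id):
    # Latency and failure of MS2-MS4 are all on the request path.
    response = {}
    for fetch in (fetch_from_ms2, fetch_from_ms3, fetch_from_ms4):
        response.update(fetch(user_id))
    return response

print(handle_request(42))
```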
A better solution:

1. Use log-based change data capture (CDC) to generate events from the databases of MS2, MS3 and MS4 as and when the DBs are updated by their respective services.
2. Post the events to one or more topics of a streaming platform (e.g. Kafka).
3. Using stream processing, process the events and build the aggregated data for each user in the cache and DB of MS1.
4. Serve requests to MS1 from the cache and/or DB of MS1.
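The stream-processing step above can be sketched like this (a toy in-memory version: the event shape, source names, and the dict standing in for MS1's cache are all assumptions; a real pipeline would consume from Kafka, e.g. with confluent-kafka, and write to a proper store):

```python
# CDC events from MS2-MS4 arrive on a topic; a stream processor
# folds them into a per-user aggregate that MS1 serves directly.

aggregates = {}  # user_id -> pre-aggregated view, i.e. MS1's cache

def apply_event(event):
    """Fold one CDC change event into the aggregate for its user."""
    view = aggregates.setdefault(event["user_id"], {})
    view[event["source"]] = event["data"]  # last write wins per source

# Simulated stream of change events, as produced by CDC on each DB.
stream = [
    {"user_id": 1, "source": "MS2", "data": {"name": "Alice"}},
    {"user_id": 1, "source": "MS3", "data": {"orders": 3}},
    {"user_id": 1, "source": "MS2", "data": {"name": "Alice B."}},  # update
]

for event in stream:
    apply_event(event)

# MS1 now answers from the aggregate without calling MS2/MS3.
print(aggregates[1])
```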
Note that with this approach, the cache or DB holds pre-aggregated data that is kept up to date by the event and stream processing. The updates may lag slightly, so stale data can be served, but the delay shouldn't be more than a few seconds under normal circumstances.
If all the user data fits in cache, you can keep the entire data set there. Otherwise, you can keep a subset of the data in cache with a TTL, evicting the least recently used entries to make room for new ones. The service will retrieve data from the DB if it is not already available in the cache.
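A minimal sketch of that cache policy (bounded size, TTL, LRU eviction, DB fallback on a miss; `load_from_db` is a hypothetical stand-in for a real query):

```python
import time
from collections import OrderedDict

class TTLLRUCache:
    def __init__(self, capacity, ttl_seconds, load_from_db):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.load_from_db = load_from_db
        self.entries = OrderedDict()  # key -> (value, expires_at)

    def get(self, key):
        entry = self.entries.get(key)
        if entry is not None and entry[1] > time.monotonic():
            self.entries.move_to_end(key)  # mark as recently used
            return entry[0]
        # Miss or expired: fall back to the DB and repopulate.
        value = self.load_from_db(key)
        self.entries[key] = (value, time.monotonic() + self.ttl)
        self.entries.move_to_end(key)
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)  # evict least recently used
        return value

cache = TTLLRUCache(capacity=2, ttl_seconds=60,
                    load_from_db=lambda k: f"row-{k}")
cache.get("a"); cache.get("b"); cache.get("a")
cache.get("c")  # over capacity: "b" (least recently used) is evicted
print(list(cache.entries))
```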
Advantages:
You likely need to investigate a caching layer inside the client application. You don't want to break your encapsulation, but caching this information as close to where it is used as possible will make a huge difference in reducing the chattiness of your microservices. One point, though: make sure you end up creating a cache, not a distributed store. Caches still need revalidation and an expiration policy.
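A sketch of that distinction: entries have an explicit lifetime and are re-fetched from the service when they expire, rather than being treated as an authoritative copy. The `fetch` callable and the version token are hypothetical; a real client might use HTTP ETags for revalidation.

```python
import time

class ClientCache:
    def __init__(self, fetch, ttl_seconds):
        self.fetch = fetch     # callable: key -> (value, version)
        self.ttl = ttl_seconds
        self.entries = {}      # key -> (value, version, expires_at)

    def get(self, key):
        entry = self.entries.get(key)
        if entry and entry[2] > time.monotonic():
            return entry[0]    # still fresh: no network call
        value, version = self.fetch(key)  # expired: revalidate upstream
        self.entries[key] = (value, version, time.monotonic() + self.ttl)
        return value

# Usage: the second get is served locally, so the service is hit once.
calls = []
def fetch(key):
    calls.append(key)
    return (f"value-{key}", 1)

cache = ClientCache(fetch, ttl_seconds=60)
cache.get("user:1"); cache.get("user:1")
print(len(calls))
```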