Question
I'm developing a Node.js backend for a real-time chat app on Heroku. As I look at dynos and ways to scale the Node.js backend, I can see the advantage dynos have for HTTP servers, because each dyno can be independent of the other dynos (which is fine for most cases).
My question is: how do you scale and handle load balancing for a real-time socket.io app? From what I'm reading, dynos are 'sandboxed' containers: each dyno runs its own process, independent of the other dynos. So what is the best way to handle this?
I was thinking of a solution, but it isn't elegant or pretty at all:
I could have multiple background jobs with crons that check for new messages for the users connected on that instance, but I think there must be a better solution.
Answer 1:
This problem can be generalized to how to get multiple node processes to share data whether they are on the same or different servers. As far as I have seen, the conventional wisdom is to have all of the processes read and write their data using a common database (Postgres, Mongo, Redis). Just use the correct database for your needs.
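To make this concrete, here is a minimal sketch of what the shared-Redis approach can look like with socket.io. It assumes the @socket.io/redis-adapter and redis npm packages and a REDIS_URL config var (e.g. from a Heroku Redis add-on); none of these are named in the answer, they are just one common way to wire it up. With the adapter in place, an io.emit() on one dyno reaches clients connected to any other dyno.

```js
// server.js - minimal sketch, assuming @socket.io/redis-adapter + redis (node-redis v4)
const { createServer } = require('http');
const { Server } = require('socket.io');
const { createClient } = require('redis');
const { createAdapter } = require('@socket.io/redis-adapter');

const httpServer = createServer();
const io = new Server(httpServer);

// Two Redis connections: one to publish, one to subscribe (the adapter needs both)
const pubClient = createClient({ url: process.env.REDIS_URL });
const subClient = pubClient.duplicate();

Promise.all([pubClient.connect(), subClient.connect()]).then(() => {
  // Route socket.io's internal broadcasts through Redis pub/sub,
  // so emits made on one dyno are relayed to every other dyno
  io.adapter(createAdapter(pubClient, subClient));

  io.on('connection', (socket) => {
    socket.on('chat message', (msg) => {
      // Reaches clients on ALL dynos, not just the one this socket landed on
      io.emit('chat message', msg);
    });
  });

  httpServer.listen(process.env.PORT || 3000);
});
```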
Another option would be something like MessengerJS, which allows for interprocess communication. I don't know if this is a good idea for your application, because then all of your dynos would end up holding a copy of the chat data, and it would be up to you to keep everything synced and consistent between the dynos. I would be more inclined to have a DB as the single source of truth.
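If you go the database-as-source-of-truth route, the dynos themselves stay stateless: each one writes incoming messages to the shared store and reads history from it when a client connects. A rough sketch below uses a Redis list keyed by room (the key layout and the 100-message cap are arbitrary assumptions, not anything from the answer); a Postgres or Mongo table would work the same way.

```js
// Sketch: persist chat history in shared Redis so any dyno can serve it (assumed schema)
async function saveMessage(redisClient, room, msg) {
  const key = `chat:history:${room}`;          // hypothetical key layout
  await redisClient.rPush(key, JSON.stringify(msg));
  await redisClient.lTrim(key, -100, -1);      // keep only the last 100 messages
}

async function loadHistory(redisClient, room) {
  const raw = await redisClient.lRange(`chat:history:${room}`, 0, -1);
  return raw.map((s) => JSON.parse(s));
}
```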
Source: https://stackoverflow.com/questions/28554980/do-multiple-web-dynos-make-sense-in-a-real-time-socket-io-node-js-app