Question
I am working on an application that depends on a MySQL database for parts of its content. In some cases, the application will run in an environment with limited internet connectivity (UMTS), which in particular suffers from high latency.
A user of the application is able to log in, and most of the content for the application's user interface is retrieved from a MySQL database. To prevent delays after a user logs in, I would like to cache as much of the database content on the client side as possible. New content should only be retrieved from the database when a relevant change has been made. Is there a common way to approach this problem? Any literature that describes solid solutions to it?
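One common approach to the "only fetch when something changed" requirement is a version counter the server bumps on every relevant change. The sketch below is a minimal illustration of the idea, not from the question itself: the table name `content_version` and the stub fetch functions are assumptions, with the real MySQL queries replaced by stand-ins.

```python
# Hypothetical sketch: client-side cache invalidated by a server-side version
# counter. The schema (a `content_version` table) and the stub functions are
# assumptions for illustration.
import json
import os

CACHE_FILE = "ui_content.cache.json"

def fetch_remote_version():
    # Stand-in for a tiny query like:
    #   SELECT version FROM content_version WHERE id = 1
    return 7

def fetch_remote_content():
    # Stand-in for the full, expensive, high-latency content query.
    return {"menu": ["home", "reports"], "labels": {"greeting": "Hello"}}

def load_content():
    remote_version = fetch_remote_version()       # one small round trip
    if os.path.exists(CACHE_FILE):
        with open(CACHE_FILE) as f:
            cached = json.load(f)
        if cached["version"] == remote_version:
            return cached["content"]              # cache hit: skip the big query
    content = fetch_remote_content()              # cache miss: full fetch
    with open(CACHE_FILE, "w") as f:
        json.dump({"version": remote_version, "content": content}, f)
    return content
```

On a high-latency link this reduces the common case to a single small request; only the version check pays the round-trip cost unless the content actually changed.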
Small update: I am now looking into CouchDB as the solution for my particular use-case. The main reason being that:
it allows for users and servers to access and update the same shared data while disconnected and then bi-directionally replicate those changes later.
(from: http://couchdb.apache.org/docs/overview.html)
So far it looks really promising.
Answer 1:
We actually run a local copy of the database and have most of the software just talk to the local database which is always available. Then we have a background process that works to keep the two databases in sync.
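The background sync process described above could be sketched as a periodic pull of rows modified since the last sync. This is one possible interpretation, assuming each synced table carries a `modified_at` timestamp column (the answer does not specify the mechanism); two in-memory SQLite databases stand in for the remote and local MySQL instances.

```python
# Hypothetical sketch of a one-way background sync, assuming a `modified_at`
# timestamp column on each synced table (an assumption, not from the answer).
import sqlite3
import time

def pull_changes(remote, local, table, last_sync):
    """Copy rows changed on the remote since last_sync into the local DB."""
    rows = remote.execute(
        f"SELECT id, payload, modified_at FROM {table} WHERE modified_at > ?",
        (last_sync,),
    ).fetchall()
    for row in rows:
        local.execute(
            f"INSERT OR REPLACE INTO {table} (id, payload, modified_at) "
            "VALUES (?, ?, ?)",
            row,
        )
    local.commit()
    return len(rows)

# Demo: two in-memory SQLite databases standing in for remote/local MySQL.
remote = sqlite3.connect(":memory:")
local = sqlite3.connect(":memory:")
for db in (remote, local):
    db.execute(
        "CREATE TABLE content "
        "(id INTEGER PRIMARY KEY, payload TEXT, modified_at REAL)"
    )
remote.execute("INSERT INTO content VALUES (1, 'hello', ?)", (time.time(),))
remote.commit()

changed = pull_changes(remote, local, "content", last_sync=0)
```

Note this sketch is one-way and last-writer-wins; handling local writes and conflicts is exactly the hard part the warning below is about.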
I will warn you that any type of local caching is not for the faint of heart. There are soooo many ways it can get out of sync. Avoid doing it if you can.
Answer 2:
I'm not sure if this is common practice, but you could write the cached data to a JavaScript file that the client would cache. Then you can make Ajax calls to see if there have been any changes, which would be small requests. If there are no changes, use the cached data.
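The "small request" change check above can be reduced to comparing a fingerprint of the content. As a minimal sketch (the checksum scheme and names are illustrative assumptions, not part of the answer), the poll transfers only a hash, and the client re-downloads the cached data file only when the hash differs:

```python
# Hypothetical sketch: the change-check request compares only a checksum of
# the content, so unchanged content costs one tiny transfer. Names are
# illustrative.
import hashlib
import json

def content_checksum(content):
    """Cheap, stable fingerprint the Ajax-style poll can compare against."""
    blob = json.dumps(content, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

server_content = {"menu": ["home", "reports"]}
cached_checksum = content_checksum(server_content)  # stored with the JS file

# Later: the poll fetches only the current checksum, not the content itself.
needs_refresh = content_checksum(server_content) != cached_checksum
```

This is essentially what HTTP conditional requests (ETag / If-None-Match) give you for free when the cached data is served as a static file.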
Source: https://stackoverflow.com/questions/2217786/best-practice-for-a-local-database-cache