We have a C++ application whose performance we are trying to improve. We identified that data retrieval takes a lot of time, and we want to cache the data. We can't store all of the data in memory.
Update: I got it now...
This should be reasonably fast. Warning: the code below is still a sketch; it assumes CachedItem and noncached_fetch() are declared elsewhere.
#include <algorithm>
#include <map>
#include <vector>

// accesses holds the ids ordered by recency of use: the most recently
// used id is at front(), the least recently used id is at back().
std::vector<long> accesses;
std::map<long, CachedItem*> cache;

CachedItem* get(long id) {
    std::map<long, CachedItem*>::iterator it = cache.find(id);
    if (it != cache.end()) {
        // In cache. Move id to the front of accesses.
        std::vector<long>::iterator pos =
            std::find(accesses.begin(), accesses.end(), id);
        if (pos != accesses.begin()) {
            accesses.erase(pos);
            accesses.insert(accesses.begin(), id);
        }
        return it->second;
    }
    // Not in cache, fetch and add it.
    CachedItem* item = noncached_fetch(id);
    accesses.insert(accesses.begin(), id);
    cache[id] = item;
    if (accesses.size() > 1000) {
        // Evict the least recently used item.
        // (If the cache owns the items, delete the evicted CachedItem here.)
        cache.erase(accesses.back());
        accesses.pop_back();
    }
    return item;
}
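Call sites then go through get() instead of calling noncached_fetch() directly; the ids here are just for illustration:

CachedItem* first = get(42);   // not cached yet: fetched via noncached_fetch()
CachedItem* again = get(42);   // cached: returned from the map, 42 moves to the front of accesses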
The inserts and erases on the vector may be a little expensive, but given the locality (few CPU cache misses) they may not be too bad. If they do become a problem, one could switch to std::list; a sketch of that variant follows below.
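For completeness, a rough sketch of that std::list variant. To make it pay off, the map also stores an iterator to each id's node in the list, so the move-to-front and the eviction need no linear search. As above, CachedItem and noncached_fetch() are assumed to be declared elsewhere.

#include <list>
#include <map>

// front() = most recently used id, back() = least recently used id.
std::list<long> accesses;

// Each cache entry keeps the item plus an iterator to its id's node in
// accesses; std::list iterators stay valid when other nodes are moved or
// erased, so storing them is safe.
struct Entry {
    CachedItem* item;
    std::list<long>::iterator pos;
};
std::map<long, Entry> cache;

CachedItem* get(long id) {
    std::map<long, Entry>::iterator it = cache.find(id);
    if (it != cache.end()) {
        // Move the id's node to the front; splice relinks the node without copying.
        accesses.splice(accesses.begin(), accesses, it->second.pos);
        return it->second.item;
    }
    CachedItem* item = noncached_fetch(id);
    accesses.push_front(id);
    Entry entry = { item, accesses.begin() };
    cache[id] = entry;
    if (accesses.size() > 1000) {
        // Evict the least recently used entry.
        // (Again, delete the evicted CachedItem here if the cache owns it.)
        cache.erase(accesses.back());
        accesses.pop_back();
    }
    return item;
}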