Question
I have a site with several pages for each company, and I want to show how each company's page is performing in terms of the number of people visiting its profile.
We have already made sure that bots are excluded.
Currently, we record each hit in the database with either an insert (for the first request to a profile in a day) or an update (for subsequent requests to that profile in the same day). But given that requests have grown from a few thousand per day to tens of thousands per day, these inserts/updates are causing major performance issues.
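For reference, a minimal sketch of the kind of per-request upsert we do now (table, model, and column names are hypothetical; it assumes a MySQL unique index on (profile_id, day)):

```ruby
# Hypothetical model behind the current scheme: one row per profile
# per day, incremented on every page view.
class ProfileHit < ActiveRecord::Base
  def self.record!(profile_id, count = 1)
    # One DB round trip per page view; at tens of thousands of hits
    # per day this write load is what hurts.
    connection.execute(<<-SQL)
      INSERT INTO profile_hits (profile_id, day, hits)
      VALUES (#{connection.quote(profile_id)}, CURDATE(), #{Integer(count)})
      ON DUPLICATE KEY UPDATE hits = hits + #{Integer(count)}
    SQL
  end
end
```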
Assuming no JavaScript-based solution, what would be the best way to handle this?
I am using Ruby on Rails, MySQL, Memcache, Apache, and HAProxy to run the whole show.
Any help will be much appreciated.
Thanks
Answer 1:
See http://www.scribd.com/doc/49575/Scaling-Rails-Presentation-From-Scribd-Launch (start reading from slide 17). I don't think performance will be a problem, given that it was possible to build a solution like this for a website as big as Scribd.
Answer 2:
Here are four ways to address this, from simple estimates to complex but accurate:
- Track only a percentage (10% or 1%) of users, then multiply to get an estimate of the total count.
- After the first 50 counts for a given page, update the count only 1/13th of the time, incrementing by 13 each time. This keeps small counts exact while cutting writes on the few pages that generate most of the hits. (Use 13 because it's hard to notice that the increment isn't 1; see the first sketch after this list.)
- Keep exact counts in a cache layer like memcache or local server memory, and flush them to disk once they reach 10 counts or have been in the cache for a certain amount of time (see the second sketch after this list).
- Build a separate counting layer that (1) always has the current count available in memory, (2) persists the count to its own tables/database, and (3) exposes calls that update both places.
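To make the first two ideas concrete, here is a minimal sketch. The names are hypothetical, it reuses the ProfileHit.record! upsert sketched in the question, and current_count would come from wherever you already cache the page's count:

```ruby
# Hypothetical sampled counter: exact below 50 hits, then 1-in-13 writes.
EXACT_UNTIL = 50   # small counts stay exact below this threshold
STRIDE      = 13   # write 1/13th of the time, adding 13 per write

def record_hit_sampled(profile_id, current_count)
  if current_count < EXACT_UNTIL
    # Every hit writes, so small counts are exact.
    ProfileHit.record!(profile_id, 1)
  elsif rand(STRIDE).zero?
    # Fires with probability 1/13; adding 13 keeps the expected value
    # of the stored count unbiased while cutting writes roughly 13x.
    ProfileHit.record!(profile_id, STRIDE)
  end
end
```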
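And a hedged sketch of the cache-buffered approach (the third item), assuming the Dalli memcached client and the same hypothetical ProfileHit.record! upsert. Counts accumulate in memcache, which is already in your stack, and every tenth hit is persisted as one batched write:

```ruby
require 'dalli'
require 'date'

CACHE           = Dalli::Client.new('localhost:11211')
FLUSH_THRESHOLD = 10      # persist to MySQL every 10 buffered hits
DAY_TTL         = 86_400  # counters live for one day

def record_hit_buffered(profile_id)
  key = "hits:#{profile_id}:#{Date.today}"
  # Dalli's incr is atomic on the server; the last argument seeds the
  # counter with 1 if the key does not exist yet.
  count = CACHE.incr(key, 1, DAY_TTL, 1)
  # Exactly one request observes each multiple of the threshold, so
  # each flush persists one batch of FLUSH_THRESHOLD hits.
  ProfileHit.record!(profile_id, FLUSH_THRESHOLD) if (count % FLUSH_THRESHOLD).zero?
end
```

The trade-off: up to FLUSH_THRESHOLD - 1 hits can sit unflushed (and be lost if the key is evicted); a periodic job that sweeps the keys closes that gap, which is essentially the fourth option's dedicated counting layer.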
Source: https://stackoverflow.com/questions/3000925/tracking-impressions-visits-per-web-page