What's a good way to survive abnormally high traffic spikes?
My thought is that at some trigger, my website should temporarily switch into a "low bandwidth" mode.
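One way the trigger could work, as a minimal sketch: watch a server-load signal and flip a flag that swaps the full page for a stripped-down one. The threshold value and both render functions here are assumptions, not from the question.

```python
import os

LOAD_THRESHOLD = 8.0  # assumed 1-minute load average that flips the switch

def low_bandwidth_mode(load=None):
    # Use the system 1-minute load average unless a value is passed in
    # (passing one in makes this easy to test on any machine).
    if load is None:
        load = os.getloadavg()[0]
    return load > LOAD_THRESHOLD

def render_page(full_render, lite_render):
    # Serve the lightweight version only while the site is under pressure.
    return lite_render() if low_bandwidth_mode() else full_render()
```

In practice the check would run per request (or be cached for a few seconds), and `lite_render` would drop images, heavy widgets, and anything needing a database hit.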
Increase the level of caching from the DB so that the content may be slightly more out of date but is accessed faster. Naturally, this only applies if the content does not have to be 100% consistent.
The basics:
The real answers:
Cache data.
Unnecessary trips to the database to display something that renders the same on every load are what kill a server. Write the output to a file and use that instead. Most CMSs and frameworks have caching built in (but you have to turn it on), and rolling your own is not the most challenging task.
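Rolling your own file-based page cache can be as simple as this sketch: render the page once, write the HTML to disk, and serve the file on later requests until it goes stale. The directory name and freshness window are assumptions.

```python
import os
import time

CACHE_DIR = "cache"   # assumed location for cached HTML files
MAX_AGE = 300         # seconds a cached page is considered fresh

def cached_page(name, render):
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, name + ".html")
    # Fresh copy on disk? Serve it without touching templates or the DB.
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < MAX_AGE:
        with open(path) as f:
            return f.read()
    html = render()               # the expensive part: templates + database
    with open(path, "w") as f:    # cache it for the next MAX_AGE seconds
        f.write(html)
    return html
```

A real version would also handle concurrent writes (write to a temp file and rename) and invalidate the file when the underlying content changes, but this is the whole idea.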
Make sure all pages you build are static: no database calls, and no images.
Actually, this place isn't doing THAT bad.
Here's a rather lengthy but highly informative article about surviving "flash crowds".
Here's their scenario for the situation their proposed solutions address:
In this paper, we consider the question of scaling through the eyes of a character we call the garage innovator. The garage innovator is creative, technically savvy, and ambitious. She has a great idea for the Next Big Thing on the web and implements it using some spare servers sitting out in the garage. The service is up and running, draws new visitors from time to time, and makes some meager income from advertising and subscriptions. Someday, perhaps, her site will hit the jackpot. Maybe it will reach the front page of Slashdot or Digg; maybe Valleywag or the New York Times will mention it.
Our innovator may get only one shot at widespread publicity. If and when that happens, tens of thousands of people will visit her site. Since her idea is so novel, many will become revenue-generating customers and refer friends. But a flash crowd is notoriously fickle; the outcome won't be nearly as idyllic if the site crashes under its load. Many people won't bother to return if the site doesn't work the first time. Still, it is hard to justify paying tens of thousands of dollars for resources just in case the site experiences a sudden load spike. Flash crowds are both the garage innovator's bane and her goal.
One way out of this conundrum has been enabled by contemporary utility computing.
The article then proposes a number of steps the garage innovator can take, such as using storage delivery networks and implementing highly scalable databases.
I think the premise is wrong: you really, really want to get slashdotted; otherwise you wouldn't have a web site in the first place. A much better question is how you handle the extra traffic. And even that is really two questions: