What's a good way to survive abnormally high traffic spikes?
My thought is that at some trigger, my website should temporarily switch into a "low bandwidth" mode: swi…
Make sure your pages support Last-Modified & If-Modified-Since and/or ETag & If-None-Match headers. With these, the server can skip regenerating and re-sending pages the client already has, which avoids a lot of computation and transfer entirely.
Search for HTTP conditional GET for more information.
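For instance, here's a rough sketch in Python of a handler that answers 304 Not Modified when the client's If-None-Match matches the current ETag (render_page is a hypothetical stand-in for whatever builds your HTML):

    import hashlib
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def render_page():
        # Hypothetical page builder; stands in for whatever generates your HTML.
        return b"<html><body>Hello</body></html>"

    class ConditionalGetHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = render_page()
            # Any stable hash of the content works as an ETag validator.
            etag = '"%s"' % hashlib.md5(body).hexdigest()
            # If the client already has this version, skip the transfer entirely.
            if self.headers.get("If-None-Match") == etag:
                self.send_response(304)
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("ETag", etag)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), ConditionalGetHandler).serve_forever()

In practice you also want a cheap way to compute or look up the validator without rebuilding the whole page, otherwise you only save the transfer, not the computation.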
For sites that experience high traffic, Akamai is a good way to make the site fast, extraordinarily scalable, and reliable regardless of what your own infrastructure can handle. Akamai is a (paid) service that caches your site at locations around the world. At my last job, our e-commerce catalog was cached through them; our servers could go down and nobody would notice unless they tried adding something to their cart. Our image servers also went down once, and Akamai's caching saved us again.
You can also use Nagios to monitor server health and, when conditions that you define are met, run a prepared SQL script that switches your website into low-bandwidth mode.
For example, put "UPDATE settings_table SET bandwidth = 'low';" into that SQL script, run it against MySQL when the trigger fires, and run the opposite update when conditions return to normal.
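A minimal sketch of what such a trigger script could look like in Python, assuming the mysql command-line client is installed, the handler is wired up to receive the service state as its first argument, and using placeholder credentials with the settings_table from above:

    import subprocess
    import sys

    def set_bandwidth_mode(mode):
        # mode should be 'low' or 'normal'; table and column come from the example above.
        sql = "UPDATE settings_table SET bandwidth = %r;" % mode
        subprocess.run(
            ["mysql", "--host=localhost", "--user=webapp", "mydb", "-e", sql],
            check=True,
        )

    if __name__ == "__main__":
        # The Nagios event-handler command is assumed to pass the service state
        # (OK, WARNING, CRITICAL, ...) as the first argument.
        state = sys.argv[1] if len(sys.argv) > 1 else "OK"
        set_bandwidth_mode("low" if state == "CRITICAL" else "normal")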
There's simply no way to know whether your website will survive heavy loads unless you stress test it. Use something like siege and see where your performance problems lie. Does memory usage grow too quickly? Does it slow down with lots of concurrent connections? Does it start taking forever to access the database?
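For instance, a short siege run along these lines (the URL and numbers are placeholders) throws 50 concurrent users at the site for one minute and reports transaction rate, failures, and response times:

    siege -c 50 -t 1M http://www.example.com/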
Once you know where the performance problems lie, then it becomes a matter of getting rid of them. Unfortunately, it's difficult to go into much more detail than that without knowing more about your particular situation, but keep in mind that you ARE talking about optimizations here. Thus, you should only act when you KNOW there are performance problems.
And I would argue that you're not necessarily just preparing for a once-in-a-lifetime event. DoS attacks still happen, so it's good to have preparations in place even if your site doesn't get slashdotted.
The one thing I can think of off the top of my head that helps in almost all situations is gzipping your content. That saves a lot of bandwidth, and all modern browsers support it without much of a performance penalty.
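Most web servers can do this for you with a configuration switch, but here's a rough sketch of doing it by hand in Python (the function name is made up), keyed off the client's Accept-Encoding header:

    import gzip

    def maybe_gzip(body, accept_encoding):
        # body: response bytes; accept_encoding: the request's Accept-Encoding header value.
        if "gzip" in (accept_encoding or ""):
            return gzip.compress(body), {"Content-Encoding": "gzip", "Vary": "Accept-Encoding"}
        return body, {}

    # Example: a repetitive HTML payload shrinks dramatically.
    compressed, headers = maybe_gzip(b"<p>hello</p>" * 1000, "gzip, deflate")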
Auto-redirect to Coral CDN, unless the request comes from Coral CDN itself.
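One way to sketch that in Python is to redirect ordinary visitors to the Coralized hostname while serving Coral's own proxies directly, so the redirect doesn't loop; detecting the proxy by its User-Agent ("CoralWebPrx") is an assumption you should verify against Coral's docs:

    def coral_redirect(host, path, user_agent):
        # Requests from Coral's own proxies must be served directly or the redirect loops.
        # Matching the proxy's User-Agent ("CoralWebPrx") is an assumption; verify it.
        if "CoralWebPrx" in (user_agent or "") or host.endswith(".nyud.net"):
            return None  # serve the page normally
        return "http://%s.nyud.net%s" % (host, path)  # send a 302 to this URL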
nearlyfreespeech.net is a semi-cloud, so to speak, and helps a ton in situations like this. As others mentioned above, layered caching helps a lot: pull chunks of information from memcached instead of the database, and put a reverse proxy (or a distributed reverse proxy, i.e. a CDN; Panther Networks is cheap) in front of your servers.
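For the memcached part, the usual cache-aside pattern looks roughly like this (using the pymemcache client; the key, TTL, and load_from_database function are placeholders):

    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))

    def load_from_database(key):
        # Placeholder for the expensive query you want to stop running on every hit.
        return b"expensive result for " + key.encode()

    def get_chunk(key, ttl=60):
        value = cache.get(key)
        if value is None:                      # cache miss: do the expensive work once...
            value = load_from_database(key)
            cache.set(key, value, expire=ttl)  # ...and keep it for ttl seconds
        return value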