I have a database-driven website serving about 50,000 pages. I want to track each webpage/record hit, and I will do this by writing logs and then batch-processing them:
- Write each hit to a flat file.
- Rotate the logs.
- Batch-load the files into the database on a schedule.
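A minimal sketch of the write-and-rotate side, assuming Python; the logger name `hits`, the file path, and the tab-separated line format are illustrative choices, not a prescribed API:

```python
import logging
from logging.handlers import TimedRotatingFileHandler

hit_log = logging.getLogger("hits")
hit_log.setLevel(logging.INFO)
# Rotate at midnight, keeping 7 old files (hits.log.2024-01-01, ...).
handler = TimedRotatingFileHandler("hits.log", when="midnight", backupCount=7)
handler.setFormatter(logging.Formatter("%(asctime)s\t%(message)s"))
hit_log.addHandler(handler)

def record_hit(page_id, user_ip):
    """Append one tab-separated hit record; cheap, no database round trip."""
    hit_log.info("%s\t%s", page_id, user_ip)

record_hit(12345, "203.0.113.7")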
There are many reasons to choose this architecture:

- Ease of scaling: write to as many logs as you need and load them all into the database later.
- No reliance on a single point of failure (SPOF) in the database: if something goes wrong, you just accumulate logs for a while.
- The ability to do cleaning and non-trivial parsing at load time without burdening your production servers.
- And more.
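And a sketch of the scheduled loader, assuming the tab-separated format above and SQLite as a stand-in for your database; the table name and the skip-malformed-lines cleaning rule are illustrative:

```python
import sqlite3

def load_log(path, db_path="hits.db"):
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS hits
                    (logged_at TEXT, page_id INTEGER, user_ip TEXT)""")
    rows = []
    with open(path) as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 3:  # cleaning at load time: skip malformed lines
                continue
            logged_at, page_id, user_ip = parts
            rows.append((logged_at, int(page_id), user_ip))
    # One transaction for the whole batch keeps the load cheap.
    with conn:
        conn.executemany("INSERT INTO hits VALUES (?, ?, ?)", rows)
    conn.close()

load_log("hits.log.2024-01-01")  # a rotated file produced by the handler above
```

Because the loader runs out of band, a failed load just leaves the rotated file in place to be retried later, which is the SPOF-tolerance point above.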