I have a database-driven website serving about 50,000 pages.
I want to track each webpage/record hit. I will do this by creating logs and then batch processing the logs.
Whether you use file-based or database-based logging, your biggest performance hit will be file/table locking. Basically, if client A and client B connect within a relatively small time frame, client B is stuck waiting for the lock on the hits file/table to be released before continuing.
The problem with a file-based mechanism is that file locking is essential to ensure that your hits don't get corrupted. The only way around that is to implement a queue that does a delayed write to the file.
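
For illustration, here is a minimal Python sketch of that delayed-write queue: request handlers push hit records onto an in-memory queue and return immediately, while a single writer thread drains the queue and appends batches to the log file, so only one writer ever touches the file. All names here (HITS_LOG, log_hit, the batch size) are illustrative, not from any particular framework.

    import queue
    import threading

    HITS_LOG = "hits.log"    # hypothetical log path
    FLUSH_BATCH = 100        # write out after this many queued hits

    _hits = queue.Queue()

    def log_hit(page_id):
        """Called from a request handler; never blocks on the file."""
        _hits.put(page_id)

    def _writer():
        while True:
            batch = [_hits.get()]  # block until at least one hit arrives
            # Drain whatever else is already queued, up to the batch size.
            while len(batch) < FLUSH_BATCH and not _hits.empty():
                batch.append(_hits.get_nowait())
            # Single writer: appends are serialized, so no lock contention.
            with open(HITS_LOG, "a") as f:
                f.writelines(f"{pid}\n" for pid in batch)

    # Daemon thread: hits still queued at shutdown may be lost, which is
    # usually acceptable for hit counting.
    threading.Thread(target=_writer, daemon=True).start()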
With database logging, you can at least do the following [MySQL using MyISAM]:
INSERT DELAYED INTO `hits` ...
See 12.2.5.2, INSERT DELAYED Syntax, in the MySQL manual for more information.
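
As a rough sketch of how that looks from application code, assuming a MyISAM `hits` table with illustrative `page_id` and `hit_at` columns, the mysql-connector-python package, and placeholder connection details: DELAYED hands the row to a server-side queue and returns immediately, so the request is not held up by the table lock.

    import mysql.connector  # assumes the mysql-connector-python package

    # Placeholder connection details; substitute your own.
    conn = mysql.connector.connect(
        host="localhost", user="webapp", password="...", database="site"
    )

    def log_hit(page_id):
        # DELAYED queues the row on the server and returns at once;
        # MyISAM is non-transactional, so no commit is needed.
        cur = conn.cursor()
        cur.execute(
            "INSERT DELAYED INTO `hits` (page_id, hit_at) VALUES (%s, NOW())",
            (page_id,),
        )
        cur.close()

    log_hit(42)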