I have a database-driven website serving about 50,000 pages.
I want to track each webpage/record hit. I will do this by writing log entries and then batch-processing them later.
It all depends on your infrastructure and its limitations. If the disk is slow, writing the log will be slow. If the SQL server is bogged down by requests, inserts will be slow. A flat file is probably the best way to go, but I would write your code (or use existing code such as PEAR::Log) behind an abstraction, so you can change the provider and storage method at will.
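As a minimal sketch of that abstraction, assuming a PHP stack and a flat-file backend (the `HitLogger` interface and `FileHitLogger` class are hypothetical names for illustration, not part of PEAR::Log):

```php
<?php
// Hypothetical interface so the storage backend (flat file, SQL,
// syslog, PEAR::Log, ...) can be swapped without touching page code.
interface HitLogger {
    public function log($recordId);
}

// Flat-file backend: appends one JSON line per hit, with an
// exclusive lock so concurrent requests do not interleave writes.
class FileHitLogger implements HitLogger {
    private $path;

    public function __construct($path) {
        $this->path = $path;
    }

    public function log($recordId) {
        $entry = json_encode(array(
            'ts'     => date('c'),
            'record' => $recordId,
            'ip'     => isset($_SERVER['REMOTE_ADDR'])
                        ? $_SERVER['REMOTE_ADDR'] : null,
        )) . "\n";
        file_put_contents($this->path, $entry, FILE_APPEND | LOCK_EX);
    }
}

// Usage on each page view:
$logger = new FileHitLogger('/var/log/app/hits.log');
$logger->log($pageRecordId);
```

A cron job can then read the file line by line, bulk-insert the rows into SQL, and truncate the file, keeping the per-request cost to a single disk append.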