I've recently implemented memcache on my site, which has been under heavy MySQL load (MySQL was as optimized as I could make it). It solved all my load issues, and the site is running smoothly.
Since you are caching entire pages in memcached, your pages can't share cached data from the database with each other. Say I have page1.php and page2.php, with page1 and page2 as keys in memcached. Both pages display items. I add a new item. Now I have to expire page1 and page2.
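To make that concrete, here's a rough sketch of the whole-page setup (the connection details, key names, and items table are placeholders, not from your code):

    <?php
    // Whole-page approach: each page's rendered HTML sits under its own key,
    // so every write has to expire every page key that might show the new item.
    $mc = new Memcached();
    $mc->addServer('localhost', 11211);
    $db = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

    // Adding one item forces invalidation of every page that lists items.
    $db->prepare('INSERT INTO items (title) VALUES (?)')->execute(['New item']);
    $mc->delete('page1');
    $mc->delete('page2');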
Instead, I could have an items key in memcached that page1.php and page2.php both use to display items. When I add a new item, I expire the items key (or better, update its value), and both page1.php and page2.php are up to date.
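A minimal sketch of that shared items key, assuming the same PDO and Memcached objects as above; the key name, query, and 5-minute TTL are arbitrary choices:

    <?php
    // Both page1.php and page2.php call get_items(); only this one key has to
    // be invalidated (or refreshed) when an item is added.
    function get_items(PDO $db, Memcached $mc)
    {
        $items = $mc->get('items');
        if ($items === false) {   // cache miss: fall back to MySQL
            $items = $db->query('SELECT id, title FROM items ORDER BY id DESC')
                        ->fetchAll(PDO::FETCH_ASSOC);
            $mc->set('items', $items, 300);   // cache for 5 minutes
        }
        return $items;
    }

    function add_item(PDO $db, Memcached $mc, $title)
    {
        $db->prepare('INSERT INTO items (title) VALUES (?)')->execute([$title]);
        $mc->delete('items');   // or re-set the key with the fresh list instead
    }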
If you still want to cache the entire page, you could build each key from a piece of information that changes whenever the cached data changes (this wouldn't make sense if the data changes too often). For instance:
"page1:[timestamp of newest item]"
This way you look up the newest item's timestamp, an inexpensive query, and build your cache key from it. As soon as a newer item is added, the key changes, so the stale page is never served again; the old entry simply ages out of memcached. The trade-off is that you still have to hit the database on every request to find the newest item's timestamp.
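One way that could look in code, again with placeholder names (the created_at column and render_page1() are hypothetical):

    <?php
    // Build the page key from the newest item's timestamp. When a newer item
    // appears, the key changes, the page is rebuilt, and the stale entry is
    // simply never requested again.
    function get_page1(PDO $db, Memcached $mc)
    {
        // The one unavoidable (but cheap) query on every request.
        $latest = $db->query('SELECT MAX(created_at) FROM items')->fetchColumn();

        $key  = 'page1:' . $latest;        // e.g. "page1:2010-04-12 09:30:15"
        $html = $mc->get($key);

        if ($html === false) {             // miss: rebuild the whole page
            $html = render_page1($db);     // hypothetical page-rendering function
            $mc->set($key, $html, 3600);   // old keys just age out of memcached
        }
        return $html;
    }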