I have a site with about 150K pages in its sitemap. I'm using the sitemap index generator to make the sitemaps, but really, I need a way of caching them, because building the 150 sitemaps on every request is brutal on the server.
I'm using the django-staticgenerator app to cache sitemap.xml to the filesystem and update that file when the data changes.
settings.py:
import os

# SITE_ROOT is assumed to be defined earlier in settings
STATIC_GENERATOR_URLS = (
    r'^/sitemap',
)
WEB_ROOT = os.path.join(SITE_ROOT, 'cache')
models.py:
from staticgenerator import quick_publish, quick_delete
from django.dispatch import receiver
from django.db.models.signals import post_save, post_delete
from django.contrib.sitemaps import ping_google


@receiver(post_delete)
@receiver(post_save)
def delete_cache(sender, **kwargs):
    # Check if a Page model (defined above in this file) changed
    if sender == Page:
        quick_delete('/sitemap.xml')
        # You may republish the sitemap file right away instead:
        # quick_publish('/', '/sitemap.xml')
        ping_google()
In the nginx configuration, I serve sitemap.xml from the cache folder and fall back to the Django instance when the file doesn't exist:
location /sitemap.xml {
    root /var/www/django_project/cache;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header Host $http_host;

    if (-f $request_filename/index.html) {
        rewrite (.*) $1/index.html break;
    }
    # If the file doesn't exist, pass the request to Django
    if (!-f $request_filename) {
        proxy_pass http://127.0.0.1:8000;
        break;
    }
}
With this method, sitemap.xml is always up to date, and clients (like Google) always get the XML file served statically. That's cool, I think! :)
I had a similar issue and decided to have Django write the sitemap files to disk in the static media and let the webserver serve them. I regenerate the sitemap every couple of hours, since my content doesn't change more often than that, but how often you need to write the files will depend on your content.
I used a custom Django management command with a cron job, but curl with a cron job is easier.
Here's how I use curl; I have Apache serve /sitemap.xml as a static file, not through Django:
curl -o /path/sitemap.xml http://example.com/generate/sitemap.xml
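For comparison, here is a minimal sketch of the management-command variant; the app name myapp and the command name write_sitemap are my own placeholders, and the URL and output path simply mirror the curl line above:

# myapp/management/commands/write_sitemap.py
import urllib.request

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Fetch the generated sitemap and save it where the webserver serves it"

    def handle(self, *args, **options):
        # Render the sitemap through the running site, just like the curl call
        with urllib.request.urlopen("http://example.com/generate/sitemap.xml") as response:
            xml = response.read()
        # Write it where Apache/nginx serves /sitemap.xml as a static file
        with open("/path/sitemap.xml", "wb") as f:
            f.write(xml)
        self.stdout.write("Wrote %d bytes to /path/sitemap.xml" % len(xml))

Either way, the cron entry just runs curl or python manage.py write_sitemap on whatever schedule matches how often your content changes.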
Okay, I have found some more info on this and on what Amazon are doing with their 6 million or so URLs.
Amazon simply make a new map for each day and add their new and updated URLs to it.
So this means that they end up with loads of sitemaps, but the search bot will only look at the latest ones, since those have recent updated dates. I was under the impression that you should refresh a map and not include a URL more than once, and I think that is true. But Amazon get around this because their sitemaps are more of a log: a URL may appear again in a later sitemap when it is updated, but Google won't look at the older maps because they are out of date, unless of course it does a major re-index. This approach makes a lot of sense, since all you do is build a new map each day containing the new and updated content, ping Google with it, and Google only needs to index these new URLs.
This log approach is a cinch to code, as all you need is a static data-store model that holds the XML data for each map. Your cron job can build a map, daily or weekly, and store the raw XML page in a blob or text field or what have you. You can then serve the pages straight from a handler, and the index map too.
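If it helps, here is a rough sketch of that data store, assuming one plain Django model and two hand-rolled views; StoredSitemap, serve_sitemap, sitemap_index and the example.com URLs are all placeholders of mine, not from any particular library:

import datetime

from django.db import models
from django.http import HttpResponse


class StoredSitemap(models.Model):
    """One generated <urlset> document, e.g. the new/updated URLs for one day."""
    created = models.DateField(default=datetime.date.today)
    xml = models.TextField()  # the raw sitemap XML built by the cron job


def serve_sitemap(request, pk):
    """Serve a single stored map straight from the database."""
    stored = StoredSitemap.objects.get(pk=pk)
    return HttpResponse(stored.xml, content_type="application/xml")


def sitemap_index(request):
    """Build the index map that points at every stored map."""
    entries = "".join(
        "<sitemap><loc>https://example.com/sitemaps/%d.xml</loc>"
        "<lastmod>%s</lastmod></sitemap>" % (s.pk, s.created.isoformat())
        for s in StoredSitemap.objects.all()
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + entries + "</sitemapindex>"
    )
    return HttpResponse(xml, content_type="application/xml")

Wire the two views up in urls.py, and point the cron job at a small script that creates one StoredSitemap row per day with that day's new and updated URLs.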
I'm not sure what others think, but this sounds like a very workable approach and a load off one's server, compared to rebuilding a huge map just because a few pages may have changed.
I have also considered that it may then be possible to crunch a week's worth of maps into a single weekly map, and four weeks of maps into a monthly one, so you end up with monthly maps, a map for each week in the current month, and then a map for the last 7 days. Assuming the dates are all maintained, this will reduce the number of maps and tidy up the process; I'm thinking in terms of reducing the 365 daily maps for a year down to around 12.
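Continuing the sketch above, folding old daily maps together could be as simple as the function below; consolidate_old_maps is again my own name, and it assumes each StoredSitemap.xml field holds a complete <urlset> document:

import datetime
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def consolidate_old_maps(older_than_days=7):
    """Merge every stored daily map older than the cutoff into one map."""
    cutoff = datetime.date.today() - datetime.timedelta(days=older_than_days)
    old_maps = list(StoredSitemap.objects.filter(created__lt=cutoff).order_by("created"))
    if len(old_maps) < 2:
        return  # nothing worth merging
    ET.register_namespace("", SITEMAP_NS)
    merged = ET.Element("{%s}urlset" % SITEMAP_NS)
    for stored in old_maps:
        for url in ET.fromstring(stored.xml):  # copy each <url> element across
            merged.append(url)
    # Note: a real version would also split the result if it exceeded the
    # 50,000-URL limit that a single sitemap file is allowed to contain.
    StoredSitemap.objects.create(
        created=old_maps[-1].created,  # keep the newest date of the merged maps
        xml='<?xml version="1.0" encoding="UTF-8"?>'
            + ET.tostring(merged, encoding="unicode"),
    )
    StoredSitemap.objects.filter(pk__in=[s.pk for s in old_maps]).delete()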
Here is a PDF on sitemaps and the approaches used by Amazon and CNN:
http://www.wwwconference.org/www2009/proceedings/pdf/p991.pdf
For those who (for whatever reason) would prefer to keep their sitemaps dynamically generated (e.g. for freshness, or out of laziness), try django-sitemaps. It's a streaming version of the standard sitemaps and a drop-in replacement, with much faster response times and far less memory use.