Caching sitemaps in Django

Asked 2020-12-31 16:44

I implemented a simple sitemap class using Django's default sitemap application. As it was taking a long time to execute, I added manual caching:

        class Short
4 Answers
  • 2020-12-31 16:48

    I have about 200,000 pages on my site, so I had to have the sitemap index no matter what. I ended up doing the hack, limiting each sitemap to 250 links, and also implementing a file-based cache.

    The basic algorithm is this:

    • Try to load the sitemap from a file on disk
    • If that fails, generate the sitemap, and
    • If the sitemap contains 250 links (the number set above), save it to disk and then return it.

    The end result is that the first time a sitemap is requested, if it's complete, it's generated and saved to disk. The next time it's requested, it's simply served from disk. Since my content never changes, this works very well. However, if I do want to change a sitemap, it's as simple as deleting the file(s) from disk, and waiting for the crawlers to come regenerate things.
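A minimal sketch of that algorithm in plain Python (the function and directory names here are hypothetical stand-ins, not the code linked below):

```python
# Sketch of the file-based sitemap cache described above.
# `generate_sitemap` and `cache_dir` are hypothetical placeholders.
import os

SITEMAP_LIMIT = 250  # links per sitemap page, as chosen above

def cached_sitemap(section, page, generate_sitemap, cache_dir="/tmp/sitemaps"):
    """Return sitemap XML for (section, page), caching complete pages on disk."""
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, "sitemap-%s-%s.xml" % (section, page))
    # 1. Try to load the sitemap from a file on disk.
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            return f.read()
    # 2. If that fails, generate it; the callback returns (xml, link_count).
    xml, link_count = generate_sitemap(section, page)
    # 3. Only save complete pages; a partial last page is regenerated each
    #    request, so new content shows up without manual invalidation.
    if link_count == SITEMAP_LIMIT:
        with open(path, "w", encoding="utf-8") as f:
            f.write(xml)
    return xml
```

Deleting the cached file is then all it takes to force regeneration on the next crawl.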

    The code for the whole thing is here, if you're interested: http://bitbucket.org/mlissner/legal-current-awareness/src/tip/alert/alertSystem/sitemap.py

    Maybe this will be a good solution for you too.

  • 2020-12-31 17:01

    You can also serve sitemaps in gzip format, which makes them a lot smaller; XML is perfectly suited to gzip compression. What I sometimes do: create the gzipped sitemap file(s) in a cron job and regenerate them as often as necessary; usually once a day will suffice. The code for this may look like the following. Just make sure sitemap.xml.gz is served from your domain root:

        from django.conf import settings
        from django.contrib.sitemaps import GenericSitemap
        from django.contrib.sitemaps.views import sitemap
        from django.test import RequestFactory
        import gzip

        sitemaps = {
            'page': GenericSitemap({'queryset': MyModel.objects.all().order_by('-created'),
                                    'date_field': 'created'}),
        }
        # Render the sitemap view against a fake request, then gzip the XML.
        request = RequestFactory().get('/sitemap.xml')
        response = sitemap(request, sitemaps=sitemaps)
        response.render()
        with gzip.open(settings.STATIC_ROOT + '/sitemap.xml.gz', 'wb') as f:
            f.write(response.content)  # .content is already bytes
    

    This should get you started.
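Stripped of the Django-specific parts, the cron step boils down to writing the rendered XML bytes through gzip; a small round-trip sketch (file names here are placeholders):

```python
# Sketch of the gzip write/read step. In the real cron job, `xml_bytes`
# comes from rendering Django's sitemap view; here it is just bytes.
import gzip

def write_gzipped_sitemap(xml_bytes, path):
    """Write sitemap bytes gzip-compressed to `path`."""
    with gzip.open(path, "wb") as f:
        f.write(xml_bytes)

def read_gzipped_sitemap(path):
    """Read the gzipped sitemap back as bytes (as a crawler would see it)."""
    with gzip.open(path, "rb") as f:
        return f.read()
```

Because sitemap XML is highly repetitive, the on-disk file typically ends up a small fraction of the uncompressed size.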

  • 2020-12-31 17:02

    Assuming you don't need all those pages in your sitemap, reducing the limit to get the file size down will work fine, as described in the previous answer.

    If you do want a very large sitemap and do want to use Memcached, you could split the content into multiple chunks, store them under individual keys, and then reassemble them on output. To make this more efficient, Memcached can fetch multiple keys in a single round trip, which Django's cache API exposes as cache.get_many().

    For reference, the 1 MB limit is a feature of Memcached to do with how it stores data: http://code.google.com/p/memcached/wiki/FAQ#What_is_the_maximum_data_size_you_can_store?_(1_megabyte)
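A hedged sketch of that chunking scheme (the key naming and chunk size are assumptions; `cache` below is any mapping-like store standing in for Django's cache, whose real API is cache.set_many()/cache.get_many()):

```python
# Split an oversized value across several cache keys to stay under
# memcached's ~1 MB per-value limit. Key scheme is a made-up convention.
CHUNK_SIZE = 1000 * 1000  # bytes of payload per key, below the 1 MB cap

def set_chunked(cache, key, value):
    """Store `value` as key:count plus key:0, key:1, ... chunk entries."""
    chunks = [value[i:i + CHUNK_SIZE] for i in range(0, len(value), CHUNK_SIZE)]
    cache["%s:count" % key] = len(chunks)
    for i, chunk in enumerate(chunks):
        cache["%s:%d" % (key, i)] = chunk

def get_chunked(cache, key):
    """Reassemble the value, or return None on any miss."""
    count = cache.get("%s:count" % key)
    if count is None:
        return None
    # With a real client this would be one get_many() round trip.
    parts = [cache.get("%s:%d" % (key, i)) for i in range(count)]
    if any(p is None for p in parts):
        return None  # a chunk was evicted; treat the whole value as a miss
    return "".join(parts)
```

The miss-on-any-evicted-chunk check matters: memcached may evict chunks independently, so a partial hit must fall back to regeneration.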

  • 2020-12-31 17:14

    The 50,000-link limit is not hard-coded; it comes from the `limit` attribute on the Sitemap class, which you can override.

    For example, subclass django.contrib.sitemaps.GenericSitemap:

    class LimitGenericSitemap(GenericSitemap):
        limit = 2000
    
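For completeness, a hypothetical urls.py wiring for such a subclass (MyModel, its 'created' field, and the app path are placeholders, assuming Django 2.0+ path() routing); the index view then links out to one sitemap page per 2000 links:

```python
# Hypothetical urls.py sketch; MyModel and myapp are placeholders,
# and LimitGenericSitemap is the subclass defined above.
from django.contrib.sitemaps import views as sitemap_views
from django.urls import path

from myapp.models import MyModel  # placeholder app/model

sitemaps = {
    'pages': LimitGenericSitemap({'queryset': MyModel.objects.all(),
                                  'date_field': 'created'}),
}

urlpatterns = [
    # Index listing each 2000-link sitemap page.
    path('sitemap.xml', sitemap_views.index, {'sitemaps': sitemaps}),
    path('sitemap-<section>.xml', sitemap_views.sitemap,
         {'sitemaps': sitemaps},
         name='django.contrib.sitemaps.views.sitemap'),
]
```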