Reducing repository size in Mercurial

悲&欢浪女 asked 2021-02-07 01:26

As my team works on a given project with the source in a Mercurial repository, the repository naturally grows in size. As a result, cloning the repository over the network becomes increasingly slow.

4 Answers
  •  太阳男子, 2021-02-07 02:05

    If you only need the files in a given revision, but never need to examine history or make new commits, then downloading a snapshot can be faster.

    The normal hgweb CGI script can provide a zip or tar file for any revision. The archives are generated on the fly; you only need to add

    [web]
    allow_archive = gz, zip, bz2
    

    to your configuration file. You can then find archives under URLs like

    http://server.com/repo/archive/rev.zip
    

    Replace the revision number with the branch name or changeset hash you want, then download the file with wget, curl, or a similar tool.
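    As a sketch, the download step can be scripted like this; `server.com` and `repo` are placeholders from the URL above, so substitute your own server and repository names:

```shell
# Branch name or changeset hash of the snapshot you want (placeholder value).
REV="default"
# Build the archive URL in the form hgweb serves: /repo/archive/<rev>.zip
URL="http://server.com/repo/archive/${REV}.zip"
echo "$URL"
# Fetch it with either tool (commented out here, since the server is a placeholder):
# wget -O "repo-${REV}.zip" "$URL"
# curl -L -o "repo-${REV}.zip" "$URL"
```

    The same URL pattern works with `.tar.gz` or `.tar.bz2` suffixes, matching the formats listed in `allow_archive`.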

    This strategy only pays off when the history is very large compared to the size of a single changeset.

    This can be the case if the repository contains large files that change often. The largefiles extension can be an alternative here: it lets you download only the files needed for the revision you check out. That way you avoid downloading the history of the big files and save a significant amount of bandwidth.
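    A minimal sketch of enabling the extension, assuming a standard per-repository `.hg/hgrc` or user-level `~/.hgrc` configuration file:

```ini
[extensions]
largefiles =
```

    With the extension enabled, new big files can be tracked as largefiles via `hg add --large`, and clones fetch only the largefile revisions they actually need.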
