Versioning large text files in git

Backend · Unresolved · 4 answers · 1958 views
不知归路 2020-12-30 04:44

I've used git for a while for source control and I really like it. So I started investigating using git to store lots of large binary files, which I'm finding just isn't…

4 Answers
  • 2020-12-30 05:23

    One of the side effects of large files is that git diff can run out of memory.

    While Git isn't the right tool (as mentioned in the other answers), at least the git diff issue is mitigated in git 2.2.0 (Q4 2014).
    See commit 6bf3b81 from Nguyễn Thái Ngọc Duy (pclouds):

    diff --stat: mark any file larger than core.bigfilethreshold binary

    Too large files may lead to failure to allocate memory.
    If it happens here, it could impact quite a few commands that involve diff.
    Moreover, too large files are inefficient to compare anyway (and most likely non-text), so mark them binary and skip looking at their content.
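    Since git 2.2.0 that threshold is also configurable. A minimal sketch of lowering it so diffs treat large files as binary (the `1m` value is only an example; the default is 512 MiB):

    ```shell
    # Treat any file larger than 1 MiB as binary for diff purposes.
    # The default is 512m; "1m" is just an illustrative value.
    git config core.bigFileThreshold 1m

    # Confirm the setting took effect.
    git config core.bigFileThreshold
    ```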

  • 2020-12-30 05:37

    I don't think git will do a good job at storing deltas in general, and even if you can finagle it to do so, it won't be deterministic. That said, based on http://metalinguist.wordpress.com/2007/12/06/the-woes-of-git-gc-aggressive-and-how-git-deltas-work/, you may want to try git repack -a -d --depth=250 --window=250.

    I suspect your best option is to truncate your history using git rebase, and only store the past few backups. You could do this using git branches. Make branches called daily, weekly, and yearly. Every day, commit to daily, then use git rebase --onto HEAD~4 HEAD~3 daily to delete backups older than 3 days. On the first day of every week, checkout weekly and git cherry-pick daily, then do the same git rebase to remove weekly backups older than 3 weeks. Finally, on the first day of every year, follow a similar process. You will probably want to do a git gc after this sequence each time, to free up the old space.

    But if you're doing this, you're not taking advantage of git anymore and abusing the way it works a fair amount. I think the best backup solution for you does not involve git.
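    A rough sketch of the daily step (the branch name and backup path are placeholders, and note that the reflog has to expire before git gc will actually free the dropped commit):

    ```shell
    # Hypothetical daily routine; assumes a repo whose "daily" branch
    # already holds at least 5 backup commits.
    git checkout daily
    cp /path/to/todays-backup backup.dat      # placeholder path/filename
    git add backup.dat
    git commit -m "daily backup $(date +%F)"

    # Splice out the commit that is now 4 days old:
    # replay HEAD~3..daily onto HEAD~4, dropping HEAD~3 itself.
    git rebase --onto HEAD~4 HEAD~3 daily

    # git gc alone won't reclaim the orphaned commit while the reflog
    # still references it, so expire the reflog first.
    git reflog expire --expire=now --all
    git gc --prune=now
    ```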

  • 2020-12-30 05:37

    While how much difference you see after packing the objects depends on the type of files involved, git is not a backup tool and should not be used as one. The entire philosophy of git rests on the assumption that disk space is cheap, and it optimizes for the speed of operations instead. Whether a file is binary or text, git stores it the same way; as mentioned above, the type of file only determines how much difference you see after packing. Git distinguishes binary from text files only for diff and similar purposes, not for storage.

    Use an appropriate backup tool that will also save you disk space. Something like ZFS, with its snapshots, would be worth trying out.
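    For instance, snapshot-based backups on ZFS only consume space for the blocks that changed since the last snapshot. A sketch (the pool and dataset names `tank/backups` are hypothetical, and these commands require an existing ZFS pool and root privileges):

    ```shell
    # Take a named, point-in-time snapshot of the backups dataset.
    zfs snapshot tank/backups@daily-2020-12-30

    # List snapshots and the space each one uniquely holds (USED column).
    zfs list -t snapshot

    # Roll the dataset back to a snapshot if a backup goes bad.
    zfs rollback tank/backups@daily-2020-12-30
    ```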

  • 2020-12-30 05:43

    Git isn't the greatest backup tool, but it should be able to handle appending to a text file very efficiently. I was suspicious of your results, so I repeated your experiment with a 354 MB file and git 1.7.7 on OS X. Here are my actions and the size of .git after each:

    1. git init (52K)
    2. git add mbox && git commit (110M)
    3. cat mail1 >> mbox && git commit -a -m "…" (219M)
    4. git gc (95M)
    5. cat mail2 >> mbox && git commit -a -m "…" (204M)
    6. git gc (95M)

    As you can see, git is being very efficient. 94 megs is the size of the compressed mbox. It can't get much smaller.
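    The numbered steps above can be reproduced as a script. A sketch, with generated placeholder text standing in for real mbox/mail files:

    ```shell
    set -e
    dir=$(mktemp -d) && cd "$dir"

    # Generate placeholder "mail" files; real mbox data would go here.
    base64 /dev/urandom | head -c 1000000 > mail1
    base64 /dev/urandom | head -c 1000000 > mail2

    git init -q .
    git config user.email you@example.com   # placeholder identity
    git config user.name  you

    cp mail1 mbox
    git add mbox && git commit -qm "initial mbox"

    cat mail2 >> mbox && git commit -qam "append mail2"
    git gc --quiet

    du -sh .git   # after repacking, close to the compressed size of mbox
    ```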

    I'm guessing you're either using an old version of git, or your mbox file is being compressed or encrypted by your mailer.

    • Check that the contents of your mbox which git is seeing is plain text.
    • If you're not using the latest git, upgrade and try again.