Fetch/Pull Part of Very Large Repository?

再見小時候 2021-02-08 14:22

This is probably obvious and has been asked many times in different ways before, but I have not been able to find the answer after searching for some time.

Assume the fo

1 Answer
  • 2021-02-08 15:05

    The answer to "Partial cloning" can help you start experimenting with shallow clones.
    But a shallow clone will be limited (see the command sketch just after this list):

    • to a certain depth, and/or to certain branches,
    • but not to certain files or directories (you can check out a single file or directory through sparse checkout, but you still have to fetch the full repo first!),
    • nor, until recently, to a certain commit.
      (Git 2.5 (Q2 2015) supports fetching a single commit! See "Pull a specific commit from a remote git repository".)
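
    A minimal command sketch of those options (the URL, branch name, directory, and commit hash are placeholders; the single-commit fetch also requires the server to enable uploadpack.allowReachableSHA1InWant):

        # shallow clone: only the latest commit of a single branch
        git clone --depth 1 --single-branch --branch master https://example.com/huge-repo.git

        # deepen the history later if you need more of it
        git fetch --depth=50

        # Git 2.5+: fetch one specific commit by its full SHA-1
        # (only works if the server allows it)
        git fetch origin <full-sha1>

        # sparse checkout (pre-2.25 syntax): check out only one directory,
        # but the full history is still downloaded
        git config core.sparseCheckout true
        echo "some/dir/" >> .git/info/sparse-checkout
        git checkout master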

    The real solution, though, would be to split the huge remote repo into submodules.
    See "What are Git limits" or "Git style backup of binary files" for illustrations of this kind of situation.
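
    For the submodule approach, a hedged sketch (the repository URLs and the "assets" path are hypothetical):

        # in the parent repo, reference the huge content as a separate repo
        git submodule add https://example.com/big-assets.git assets
        git commit -m "Track big assets as a submodule"

        # consumers clone the parent without the submodule content...
        git clone https://example.com/parent.git
        # ...and fetch the submodule only when they actually need it
        git submodule update --init assets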


    Update April 2015:

    Git Large File Storage (LFS) makes pull/fetch much more efficient (announced by GitHub in April 2015).

    The project is git-lfs (see git-lfs.github.com) and can be tested against a server that supports it, such as lfs-test-server:
    you store only metadata (small pointer files) in the git repo, and the large files themselves elsewhere.
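
    A minimal git-lfs usage sketch (the *.psd pattern and file name are just examples):

        # one-time setup per machine
        git lfs install

        # tell LFS which files to manage; this records the pattern in .gitattributes
        git lfs track "*.psd"
        git add .gitattributes

        # commit as usual: the repo stores a small pointer file,
        # the actual content is uploaded to the LFS server on push
        git add design.psd
        git commit -m "Add design file"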

    (Animated demo: https://cloud.githubusercontent.com/assets/1319791/7051226/c4570828-ddf4-11e4-87eb-8fc165e5ece4.gif)
