I have my Git repository which, at the root, has two subdirectories:
/finisht
/static
When this was in SVN, /finisht was checked out in one place, while /static was checked out in another. Is there a way to clone only one of these subdirectories with Git?
This will clone a specific folder and remove all history not related to it.
git clone --single-branch -b {branch} git@github.com:{user}/{repo}.git
git filter-branch --subdirectory-filter {path/to/folder} HEAD
git remote remove origin
git remote add origin git@github.com:{user}/{new-repo}.git
git push -u origin master
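If you have git filter-repo available (a separate tool, which the git filter-branch documentation itself now points to as the preferred alternative), a hedged sketch of the same flow with the same placeholders would be:
git clone --single-branch -b {branch} git@github.com:{user}/{repo}.git
cd {repo}
# Rewrite history so only {path/to/folder} remains; filter-repo removes the origin remote afterwards.
git filter-repo --subdirectory-filter {path/to/folder}
git remote add origin git@github.com:{user}/{new-repo}.git
git push -u origin master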
git clone --filter from Git 2.19 now works on GitHub (tested 2020-09-18, git 2.25.1).
This option was added together with an update to the remote protocol, and it truly prevents objects from being downloaded from the server.
E.g., to clone only the objects required for the d1 directory of this repository: https://github.com/cirosantilli/test-git-partial-clone I can do:
git clone \
--depth 1 \
--filter=blob:none \
--no-checkout \
https://github.com/cirosantilli/test-git-partial-clone \
;
cd test-git-partial-clone
git checkout master -- d1
The clone command obtains only:
a single commit pointed to by the master branch
the top-level tree objects d1, d2 and master
Then, the checkout command fetches only the missing blobs (files) from the server:
d1/a
d1/b
Even better, later on GitHub will likely start supporting:
--filter=blob:none \
--filter=tree:0 \
where --filter=tree:0 from Git 2.20 will prevent the unnecessary fetch of all tree objects during clone and allow it to be deferred to checkout. But on my 2020-09-18 test that fails with:
fatal: invalid filter-spec 'combine:blob:none+tree:0'
presumably because the --filter=combine: composite filter (added in Git 2.24, implied by multiple --filter arguments) is not yet implemented.
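For reference, the combined invocation that should work once the server accepts composite filters would look like this sketch, which currently fails on GitHub with the error above:
git clone \
  --depth 1 \
  --filter=blob:none \
  --filter=tree:0 \
  --no-checkout \
  https://github.com/cirosantilli/test-git-partial-clone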
I observed which objects were fetched with:
git verify-pack -v .git/objects/pack/*.pack
as mentioned at: Git - how to list ALL objects in the database. It does not give me a super clear indication of what each object is exactly, but it does say the type of each object (commit, tree, blob), and since there are so few objects in that minimal repo, I can unambiguously deduce what each object is.
git rev-list --objects --all
did produce clearer output with paths for trees/blobs, but it unfortunately fetches some objects when I run it, which makes it hard to determine what was fetched when; let me know if anyone has a better command.
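One candidate for a better command, which I have not verified against a partial clone and therefore only assume does not trigger fetches: git cat-file with --batch-all-objects enumerates the objects already present locally instead of walking reachability.
# List type and size of every object already in .git/objects (loose and packed).
git cat-file --batch-check --batch-all-objects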
TODO: find the GitHub announcement saying when they started supporting it. https://github.blog/2020-01-17-bring-your-monorepo-down-to-size-with-sparse-checkout/ from 2020-01-17 already mentions --filter blob:none.
git sparse-checkout
I think this command is meant to manage a settings file that says "I only care about these subtrees" so that future commands will only affect those subtrees. But it is a bit hard to be sure because the current documentation is a bit... sparse ;-)
It does not, by itself, prevent the fetching of blobs.
If this understanding is correct, then it would be a good complement to the git clone --filter described above, as it would prevent unintentional fetching of more objects if you intend to do Git operations in the partially cloned repo.
When I tried on Git 2.25.1:
git clone \
--depth 1 \
--filter=blob:none \
--no-checkout \
https://github.com/cirosantilli/test-git-partial-clone \
;
cd test-git-partial-clone
git sparse-checkout init
it didn't work because the init
actually fetched all objects.
In Git 2.28, however, the init did not fetch the objects, as desired. But then, if I do:
git sparse-checkout set d1
d1 is neither fetched nor checked out, even though this post explicitly says it should be: https://github.blog/2020-01-17-bring-your-monorepo-down-to-size-with-sparse-checkout/#sparse-checkout-and-partial-clones which comes with the disclaimer:
Keep an eye out for the partial clone feature to become generally available[1].
[1]: GitHub is still evaluating this feature internally while it’s enabled on a select few repositories (including the example used in this post). As the feature stabilizes and matures, we’ll keep you updated with its progress.
So yeah, it's just too hard to be certain at the moment, thanks in part to the joys of GitHub being closed source. But let's keep an eye on it.
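For reference, the end-to-end flow that the GitHub post describes, and which should work once partial clone is generally available on their side, is roughly this sketch (assuming Git 2.25+ on the client):
git clone --filter=blob:none --no-checkout https://github.com/cirosantilli/test-git-partial-clone
cd test-git-partial-clone
# Cone mode restricts patterns to whole directories, as in the GitHub post.
git sparse-checkout init --cone
git sparse-checkout set d1
git checkout master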
Command breakdown
The server should be configured with:
git config --local uploadpack.allowfilter 1
git config --local uploadpack.allowanysha1inwant 1
Command breakdown:
--filter=blob:none
skips all blobs, but still fetches all tree objects
--filter=tree:0
skips the unneeded trees: https://www.spinics.net/lists/git/msg342006.html
--depth 1
already implies --single-branch
, see also: How do I clone a single branch in Git?
file://$(path)
is required to overcome git clone
protocol shenanigans: How to shallow clone a local git repository with a relative path?
--filter=combine:FILTER1+FILTER2
is the syntax to use multiple filters at once; trying to pass --filter more than once fails for some reason with "multiple filter-specs cannot be combined". This was added in Git 2.24 at e987df5fe62b8b29be4cdcdeb3704681ada2b29e "list-objects-filter: implement composite filters"
Edit: on Git 2.28, I experimentally see that --filter=FILTER1 --filter FILTER2
also has the same effect as combine:, since GitHub does not implement combine: yet as of 2020-09-18 and complains fatal: invalid filter-spec 'combine:blob:none+tree:0'. TODO: introduced in which version?
The format of --filter is documented on man git-rev-list.
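A few of the filter specs documented there, shown here only as illustration (blob:limit is not used elsewhere in this answer):
git clone --filter=blob:none <url>       # omit all blobs until they are actually needed
git clone --filter=blob:limit=1m <url>   # omit blobs larger than 1 MiB
git clone --filter=tree:0 <url>          # omit all trees and blobs, fetch commits only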
Docs on Git tree:
Test it out locally
The following script reproducibly generates the https://github.com/cirosantilli/test-git-partial-clone repository locally, does a local clone, and observes what was cloned:
#!/usr/bin/env bash
set -eu
list-objects() (
  git rev-list --all --objects
  echo "master commit SHA: $(git log -1 --format="%H")"
  echo "mybranch commit SHA: $(git log -1 --format="%H")"
  git ls-tree master
  git ls-tree mybranch | grep mybranch
  git ls-tree master~ | grep root
)
# Reproducibility.
export GIT_COMMITTER_NAME='a'
export GIT_COMMITTER_EMAIL='a'
export GIT_AUTHOR_NAME='a'
export GIT_AUTHOR_EMAIL='a'
export GIT_COMMITTER_DATE='2000-01-01T00:00:00+0000'
export GIT_AUTHOR_DATE='2000-01-01T00:00:00+0000'
rm -rf server_repo local_repo
mkdir server_repo
cd server_repo
# Create repo.
git init --quiet
git config --local uploadpack.allowfilter 1
git config --local uploadpack.allowanysha1inwant 1
# First commit.
# Directories present in all branches.
mkdir d1 d2
printf 'd1/a' > ./d1/a
printf 'd1/b' > ./d1/b
printf 'd2/a' > ./d2/a
printf 'd2/b' > ./d2/b
# Present only in root.
mkdir 'root'
printf 'root' > ./root/root
git add .
git commit -m 'root' --quiet
# Second commit only on master.
git rm --quiet -r ./root
mkdir 'master'
printf 'master' > ./master/master
git add .
git commit -m 'master commit' --quiet
# Second commit only on mybranch.
git checkout -b mybranch --quiet master~
git rm --quiet -r ./root
mkdir 'mybranch'
printf 'mybranch' > ./mybranch/mybranch
git add .
git commit -m 'mybranch commit' --quiet
echo "# List and identify all objects"
list-objects
echo
# Restore master.
git checkout --quiet master
cd ..
# Clone. Don't checkout for now, only .git/ dir.
git clone --depth 1 --quiet --no-checkout --filter=blob:none "file://$(pwd)/server_repo" local_repo
cd local_repo
# List missing objects from master.
echo "# Missing objects after --no-checkout"
git rev-list --all --quiet --objects --missing=print
echo
echo "# Git checkout fails without internet"
mv ../server_repo ../server_repo.off
! git checkout master
echo
echo "# Git checkout fetches the missing directory from internet"
mv ../server_repo.off ../server_repo
git checkout master -- d1/
echo
echo "# Missing objects after checking out d1"
git rev-list --all --quiet --objects --missing=print
GitHub upstream.
Output in Git v2.19.0:
# List and identify all objects
c6fcdfaf2b1462f809aecdad83a186eeec00f9c1
fc5e97944480982cfc180a6d6634699921ee63ec
7251a83be9a03161acde7b71a8fda9be19f47128
62d67bce3c672fe2b9065f372726a11e57bade7e
b64bf435a3e54c5208a1b70b7bcb0fc627463a75 d1
308150e8fffffde043f3dbbb8573abb6af1df96e63 d1/a
f70a17f51b7b30fec48a32e4f19ac15e261fd1a4 d1/b
84de03c312dc741d0f2a66df7b2f168d823e122a d2
0975df9b39e23c15f63db194df7f45c76528bccb d2/a
41484c13520fcbb6e7243a26fdb1fc9405c08520 d2/b
7d5230379e4652f1b1da7ed1e78e0b8253e03ba3 master
8b25206ff90e9432f6f1a8600f87a7bd695a24af master/master
ef29f15c9a7c5417944cc09711b6a9ee51b01d89
19f7a4ca4a038aff89d803f017f76d2b66063043 mybranch
1b671b190e293aa091239b8b5e8c149411d00523 mybranch/mybranch
c3760bb1a0ece87cdbaf9a563c77a45e30a4e30e
a0234da53ec608b54813b4271fbf00ba5318b99f root
93ca1422a8da0a9effc465eccbcb17e23015542d root/root
master commit SHA: fc5e97944480982cfc180a6d6634699921ee63ec
mybranch commit SHA: fc5e97944480982cfc180a6d6634699921ee63ec
040000 tree b64bf435a3e54c5208a1b70b7bcb0fc627463a75 d1
040000 tree 84de03c312dc741d0f2a66df7b2f168d823e122a d2
040000 tree 7d5230379e4652f1b1da7ed1e78e0b8253e03ba3 master
040000 tree 19f7a4ca4a038aff89d803f017f76d2b66063043 mybranch
040000 tree a0234da53ec608b54813b4271fbf00ba5318b99f root
# Missing objects after --no-checkout
?f70a17f51b7b30fec48a32e4f19ac15e261fd1a4
?8b25206ff90e9432f6f1a8600f87a7bd695a24af
?41484c13520fcbb6e7243a26fdb1fc9405c08520
?0975df9b39e23c15f63db194df7f45c76528bccb
?308150e8fffffde043f3dbbb8573abb6af1df96e63
# Git checkout fails without internet
fatal: '/home/ciro/bak/git/test-git-web-interface/other-test-repos/partial-clone.tmp/server_repo' does not appear to be a git repository
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
# Git checkout fetches the missing directory from internet
remote: Enumerating objects: 1, done.
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (1/1), 45 bytes | 45.00 KiB/s, done.
remote: Enumerating objects: 1, done.
remote: Counting objects: 100% (1/1), done.
remote: Total 1 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (1/1), 45 bytes | 45.00 KiB/s, done.
# Missing objects after checking out d1
?8b25206ff90e9432f6f1a8600f87a7bd695a24af
?41484c13520fcbb6e7243a26fdb1fc9405c08520
?0975df9b39e23c15f63db194df7f45c76528bccb
Conclusions: all blobs from outside of d1/ are missing. E.g. 0975df9b39e23c15f63db194df7f45c76528bccb, which is d2/b, is not there after checking out d1/.
Note that root/root
and mybranch/mybranch
are also missing, but --depth 1
hides that from the list of missing files. If you remove --depth 1
, then they show on the list of missing files.
I have a dream
This feature could revolutionize Git.
Imagine having the entire code base of your enterprise in a single repo without ugly third-party tools like repo.
Imagine storing huge blobs directly in the repo without any ugly third party extensions.
Imagine if GitHub would allow per file / directory metadata like stars and permissions, so you can store all your personal stuff under a single repo.
Imagine if submodules were treated exactly like regular directories: just request a tree SHA, and a DNS-like mechanism resolves your request, first looking in your local ~/.git, then at closer servers (your enterprise's mirror / cache), and finally ending up on GitHub.
While I hate actually having to use SVN when dealing with Git repos :/ I use this all the time:
function git-scp() (
  URL="$1" && shift 1
  # GitHub's SVN bridge serves the default branch under /trunk, so rewrite blob/master to trunk.
  svn export ${URL/blob\/master/trunk}
)
This allows you to copy out from the GitHub URL without modification. Usage:
--- /tmp » git-scp https://github.com/dgraph-io/dgraph/blob/master/contrib/config/kubernetes/helm
A helm
A helm/Chart.yaml
A helm/README.md
A helm/values.yaml
Exported revision 6367.
--- /tmp » ls | grep helm
Permissions Size User Date Modified Name
drwxr-xr-x - anthony 2020-01-07 15:53 helm/
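The function above only handles URLs that point at master, since blob/master maps to trunk in GitHub's SVN bridge. A hypothetical, untested variant for a non-default branch, which the bridge exposes under branches/<name>, might look like:
function git-scp-branch() (
  URL="$1" BRANCH="$2" && shift 2
  # Hypothetical: rewrite blob/<branch> to branches/<branch> instead of trunk.
  svn export "${URL/blob\/$BRANCH/branches/$BRANCH}"
)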
This looks far simpler:
git archive --remote=<repo_url> <branch> <path> | tar xvf -
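A hedged, concrete example, assuming an SSH-reachable server that permits git upload-archive (GitHub notably does not enable that service); git.example.com, team/project.git, main and docs/ are placeholders. It extracts only the docs/ directory of the main branch into the current directory:
git archive --remote=git@git.example.com:team/project.git main docs/ | tar -xvf -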
What you are trying to do is called a sparse checkout, and that feature was added in Git 1.7.0 (Feb. 2010). The steps to do a sparse clone are as follows:
mkdir <repo>
cd <repo>
git init
git remote add -f origin <url>
This creates an empty repository with your remote, and fetches all objects but doesn't check them out. Then do:
git config core.sparseCheckout true
Now you need to define which files/folders you want to actually check out. This is done by listing them in .git/info/sparse-checkout, e.g.:
echo "some/dir/" >> .git/info/sparse-checkout
echo "another/sub/tree" >> .git/info/sparse-checkout
Last but not least, update your empty repo with the state from the remote:
git pull origin master
You will now have files "checked out" for some/dir
and another/sub/tree
on your file system (with those paths still), and no other paths present.
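If you later need another directory as well, append it to the same file and refresh the working tree without re-cloning (docs/ here is just an example path):
echo "docs/" >> .git/info/sparse-checkout
# Re-apply the sparse-checkout patterns to the working tree.
git read-tree -mu HEAD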
You might want to have a look at the extended tutorial and you should probably read the official documentation for sparse checkout.
As a function:
function git_sparse_clone() (
  rurl="$1" localdir="$2" && shift 2
  mkdir -p "$localdir"
  cd "$localdir"
  git init
  git remote add -f origin "$rurl"
  git config core.sparseCheckout true
  # Loops over remaining args
  for i; do
    echo "$i" >> .git/info/sparse-checkout
  done
  git pull origin master
)
Usage:
git_sparse_clone "http://github.com/tj/n" "./local/location" "/bin"
Note that this will still download the whole repository from the server – only the checkout is reduced in size. At the moment it is not possible to clone only a single directory. But if you don't need the history of the repository, you can at least save on bandwidth by creating a shallow clone. See udondan's answer below for information on how to combine shallow clone and sparse checkout.
As of Git 2.25.0 (Jan 2020), an experimental sparse-checkout command is added in Git:
git sparse-checkout init
# same as:
git config core.sparseCheckout true
git sparse-checkout set "A/B"
# same as:
echo "A/B" >> .git/info/sparse-checkout
git sparse-checkout list
# same as:
cat .git/info/sparse-checkout
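The same release also added cone mode, which restricts the patterns to whole directories and performs better on large repositories; a brief sketch:
git sparse-checkout init --cone
# In cone mode, set takes directory names rather than gitignore-style patterns.
git sparse-checkout set A/B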
You can combine the sparse checkout and the shallow clone features. The shallow clone cuts off the history and the sparse checkout only pulls the files matching your patterns.
git init <repo>
cd <repo>
git remote add origin <url>
git config core.sparsecheckout true
echo "finisht/*" >> .git/info/sparse-checkout
git pull --depth=1 origin master
You'll need at least Git 1.9 for this to work. I tested it myself only with 2.2.0 and 2.2.2.
This way you'll still be able to push, which is not possible with git archive.
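Wrapped up as a function in the same spirit as git_sparse_clone above (a sketch; the name and argument order are my own):
function git_shallow_sparse_clone() (
  rurl="$1" localdir="$2" subdir="$3"
  git init "$localdir"
  cd "$localdir"
  git remote add origin "$rurl"
  git config core.sparseCheckout true
  echo "$subdir/*" >> .git/info/sparse-checkout
  # Only the latest commit is fetched, and only files under $subdir are checked out.
  git pull --depth=1 origin master
)
Usage would be, e.g., git_shallow_sparse_clone "https://github.com/tj/n" ./n bin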