I am looking for opinions on how to handle large binary files on which my source code (a web application) depends. We are currently discussing several alternatives:
You can also use git-fat. I like that it only depends on stock Python and rsync. It also supports the usual Git workflow, with the following self-explanatory commands:
git fat init
git fat push
git fat pull
In addition, you need to check a .gitfat file into your repository and modify your .gitattributes to specify the file extensions you want git fat to manage.
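For example, a minimal setup might look like the following; the rsync host, store path, and file extensions here are placeholders for your own environment:

# .gitfat — tells git-fat where to rsync the real binary content
[rsync]
remote = storage.example.com:/var/git-fat-store

# .gitattributes — route these file types through the fat filter
*.psd filter=fat -crlf
*.zip filter=fat -crlf

Running git fat init once per clone then wires the filter into your local Git config.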
You add a binary using the normal git add, which in turn invokes git fat based on your .gitattributes rules.
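As a rough sketch of the day-to-day flow (the file name is hypothetical):

git fat init                      # once per clone, sets up the fat filter
git add design-mockup.psd         # matched by .gitattributes, so git-fat checks in a small placeholder
git commit -m "Add design mockup"
git fat push                      # upload the actual binary content to the rsync store
git push                          # push the regular Git history as usual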
Finally, it has the advantage that the location where your binaries are actually stored can be shared across repositories and users, and it supports anything rsync does.
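Because git-fat simply hands that location to rsync, in principle the remote in .gitfat can be anything rsync accepts as a target, for instance an ssh host shared by the whole team or a mounted network path (both paths below are made up):

remote = storage.example.com:/var/git-fat-store
remote = /mnt/shared/git-fat-store

Every clone that points its .gitfat at the same store pulls the binaries from one shared location.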
UPDATE: Do not use git-fat if you're using a Git-SVN bridge. It will end up removing the binary files from your Subversion repository. However, if you're using a pure Git repository, it works beautifully.