I'm already familiar with answers to the general question of how to manage large binary files (or a large number of files) in Git. And I've looked at git-annex, bup, and git
Not that I know of, and as you know, large binary files aren't a good fit for Git's storage model: they don't delta-compress well, and every clone carries every version in its full history.
The only remaining solution (an OS-agnostic one, actually) is an external artifact repository (such as Nexus) to store those binaries.
The OP Anthony Mastrean adds that he needs to:
version control my system deployment bits: OS images, drivers, 3rd party installers, 1st party installers (our applications).
I need to have everything in a coherent bundle (tags). And be able to get the entire bundle for any of our active releases
That would be mixing development concerns with deployment concerns.
Anything which isn't developed (i.e. anything built or already existing) should stay out of a VCS (except for very small resources, like icons, which don't change much).
What you usually version is a "release file" which contains all the extra information (checksums, paths to other repositories, ...) the deployment script needs in order to fetch the right artifacts.
There is no good way to manage large binary files in Git or any other version control system. The consensus is that you need a digital asset management (DAM) system for this. Digital assets are things like photos, sound clips, videos, etc.
There are a number of open-source DAM packages out there, and this page reviews all the major ones: http://www.opensourcedigitalassetmanagement.org/
If you don't need support for versioning, lots of people build quick solutions using something like MongoDB for storage.