Sorry to bring up this topic again, as there are so many related questions already - but none that covers my problem directly.
What I'm searching for is a good version control system that can:
- store large binary files (>1GB)
- support a repository that's >1TB (yes, that's TB)
Yep, that is one of the cases Apache Subversion should fully support.
So far I've got some experience with SVN and CVS; however, I'm not quite satisfied with how either of them performs with large binary files (a few MSI or CAB files will be >1GB). I'm also not sure they will scale to the amount of data we're expecting in the next 2-5 years (like I said, an estimated >1TB).
Up-to-date Apache Subversion servers and clients should have no problem handling that amount of data, and they scale well. Moreover, there are various repository replication approaches that should improve performance if you have multiple sites with developers working on the same projects.
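For example, a read-only mirror for a remote site can be kept up to date with svnsync, which ships with Subversion. A minimal sketch, assuming the master repository lives at https://master.example.com/svn/repo and the mirror is created on the replica server (both placeholders):

```
# Create an empty mirror repository on the replica server
svnadmin create /var/svn/mirror

# svnsync requires the mirror to accept revision property changes
printf '#!/bin/sh\nexit 0\n' > /var/svn/mirror/hooks/pre-revprop-change
chmod +x /var/svn/mirror/hooks/pre-revprop-change

# Point the mirror at the master repository and pull all revisions
svnsync initialize file:///var/svn/mirror https://master.example.com/svn/repo
svnsync synchronize file:///var/svn/mirror   # re-run periodically, e.g. from cron
```

Developers at the remote site can then check out from the mirror for fast reads, while commits still go to the master.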
I'm currently also looking into SVN externals as well as Git submodules, though that would mean several individual repositories for each software package, and I'm not sure that's what we want.
svn:externals have nothing to do with support for large binaries or multi-terabyte projects. Subversion scales well and supports a very large data and code base in a single repository. Git does not: with Git, you'll have to divide and split the project into multiple small repositories. This is going to lead to a lot of drawbacks and a constant PITA. That's why Git has a lot of add-ons, such as git-lfs, that try to make the problem less painful.
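As a rough illustration of the git-lfs approach (the file patterns and file name below are just examples):

```
# One-time setup per machine
git lfs install

# Tell Git LFS which file types to manage; this writes the patterns to .gitattributes
git lfs track "*.msi" "*.cab"

# Commit the tracking rules together with the large files
git add .gitattributes Setup.msi
git commit -m "Store installers via Git LFS"
```

The large content then lives on a separate LFS store rather than in the Git object database, which mitigates the problem but doesn't remove the pressure to split the repository.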
This is an old question, but one possible answer is https://www.plasticscm.com/. Their VCS can handle very large files and very large repositories. It was my choice when we were evaluating options a couple of years ago, but management pushed us elsewhere.
The perks that come with a version control system (changelog, easy RSS access, etc.) are nonexistent on a simple file share.
If you only care about the versioning metadata features and don't actually care about keeping the old data, then a solution that uses a VCS without storing the data in the VCS may be an acceptable option.
git-annex is the first one that came to mind, but judging from its "what git-annex is not" page, there seem to be other alternatives that are similar but not exactly the same.
I have not used git-annex, but from the description and walkthrough it sounds like it could work for your situation.
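As a rough sketch of the workflow described in the git-annex walkthrough (repository and file names below are made up):

```
# Turn an ordinary Git repository into an annex
git init product-media && cd product-media
git annex init "build server"

# Adding a large file puts its content under git-annex's control;
# Git itself only tracks a small pointer (symlink)
git annex add installer-1.0.iso
git commit -m "Add installer image via git-annex"

# Other clones fetch or drop the actual content on demand
git annex get installer-1.0.iso    # retrieve the content from a remote that has it
git annex drop installer-1.0.iso   # free local space, keeping only the metadata
```

So the history and metadata stay cheap, and each machine only holds the binary content it actually needs.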
Version control systems are for source code, not binary builds. You are better off just using a standard network file server with tape backups for the binaries - even though that's largely unnecessary when you have source code control, since you can rebuild any version of any binary at any time. Trying to put binaries in source code control is a mistake.
What you are really talking about is a process known as configuration management. If you have thousands of unique software packages, your business should have a configuration manager (a person, not software ;-) ) who manages all of the configurations (a.k.a. builds) for development, testing, release, release-per-customer, etc.
There are a couple of companies with products for "Wide Area File Sharing." They can replicate large files to different locations, but have distributed locking mechanisms so that only one person can work on any of the copies at a time. When someone checks in an updated copy, it is replicated to the other sites. The major use case is CAD/CAM files and other large files. See Peer Software (http://www.peersoftware.com/index.aspx) and GlobalSCAPE (http://www.globalscape.com/).
Old question, but perhaps worth pointing out that Perforce is in use at lots of large companies, particularly in game development, where multi-terabyte repositories with many large binary files are common.
(Disclaimer: I work at Perforce)