Is there any good way to handle large assets (i.e. 1000's of images, Flash movies, etc.) with a DVCS tool such as hg or git? As I see it, cloning repositories that carry gigabytes of binary assets would be a huge pain.
Thoughts, no experience: I would indeed separate code from data. Assuming there is a set of images that belongs to the application, I would keep those on a centralized server. In the code, I would then arrange (through explicit coding) for the application to integrate both local and remote assets. Contributors can first put new images in their local store, then integrate them into the central store through some kind of explicit upload procedure once they are required and approved.
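The local-first lookup described above could be sketched like this (the function name, directory layout, and central-store URL are all hypothetical):

```shell
#!/bin/sh
set -e
# Resolve an asset: prefer a locally contributed copy, otherwise
# fetch it from the central store into a cache directory.
fetch_asset() {
  name=$1
  if [ -f "local-assets/$name" ]; then
    # A contributor's not-yet-uploaded asset wins.
    printf '%s\n' "local-assets/$name"
  else
    # Otherwise download from the central store (placeholder URL).
    mkdir -p cache
    curl -fsS -o "cache/$name" "https://assets.example.com/store/$name"
    printf '%s\n' "cache/$name"
  fi
}

# Demo: a locally contributed asset resolves without touching the network.
mkdir -p /tmp/asset-demo/local-assets
cd /tmp/asset-demo
echo fake-png > local-assets/logo.png
fetch_asset logo.png
```

The approval step would then amount to moving a file from `local-assets/` to the central store and deleting the local copy.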
One fairly popular option within the game development industry (with huge repositories) is to use Plastic SCM.
They have options to store blobs in the file system instead of the database.
https://www.plasticscm.com
I've struggled with this myself. As you said, versioning GBs of assets can be a huge pain.
For projects that require external participation I've found Mercurial to be a working solution, but not a great one. It eats up disk space for large files and can be fairly slow depending on the circumstances.
For my in-house design work I prefer to use simple syncing tools (rsync, SyncToy, whatever else) to keep directories up to date between servers/machines, and then do version control manually. I find I rarely need version control for anything beyond major revisions.
These are some thoughts I've had on the subject. In the end you may need to keep assets and code as separate as possible. I can think of several possible strategies:
Assets in one repo and code in the other.
DVCS tools don't keep track of repositories other than their own, so there is no direct BOM (Bill of Materials) support, i.e. no clear-cut way to tell when the two repositories are in sync. (I guess this is what git-submodule or repo is for.)
Example: an artist adds a new picture to one repository and a programmer adds a function that uses the picture; when someone later has to backtrack versions, they are forced to keep track of the correspondence between the two repositories on their own.
The asset repository still adds clone overhead, even though it only affects those who actually use it.
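git-submodule can supply the missing BOM: the code repository records the exact commit of the asset repository, so checking out an old code revision also tells you which asset revision it was built against. A minimal local sketch (the repository names are stand-ins; in practice the submodule URL would point at your asset server):

```shell
#!/bin/sh
set -e
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.com
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.com

rm -rf /tmp/sm-demo && mkdir -p /tmp/sm-demo && cd /tmp/sm-demo

# Stand-in for the artists' asset repository.
git init -q assets-origin
git -C assets-origin commit -q --allow-empty -m "initial assets"

# The code repository pins the asset repo at an exact commit.
git init -q code
git -C code commit -q --allow-empty -m "initial code"
cd code
git -c protocol.file.allow=always submodule add -q /tmp/sm-demo/assets-origin assets
git commit -q -m "Pin assets as a submodule"

# The recorded pin: checking out this code commit later restores
# exactly this asset revision via `git submodule update --init`.
git submodule status assets
```

When the artist pushes new assets, the programmer runs `git pull` inside the submodule, then commits the updated pin in the code repository.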
Assets and code reside in the same repository, but in two separate directories.
Both strategies listed above still have the disadvantage of a large clone overhead, since you need to clone the large asset repository. One solution to this problem is a variant of the first strategy: two repositories, with the code in the distributed VCS and the assets in a centralized VCS (such as SVN, Alienbrain, etc.).
Considering how most graphic designers work with binary files, there is usually no need to branch unless it is really necessary (e.g. new features requiring lots of assets that aren't needed until much later). The disadvantage is that you will need to find a way to back up the central repository. Hence a third strategy:
Code goes in the repository as usual, and assets are not in the repository at all. Assets should instead be put in some kind of content/media/asset management system, or at least in a folder that is regularly backed up. This assumes there is very little need to backtrack versions of the graphics, and that when backtracking does happen, the graphic changes are negligible.
Maybe Git LFS should be mentioned in this context (see also Atlassian's Git LFS tutorial).