I'm working with another developer on the other side of the country who is the lead on our project, and isn't using a formal version control system. On my end, I'm tracking …
This, although I'm not entirely sure it's a good idea, is what I'm thinking of doing in almost the same situation:
What about basically maintaining a repo for them? Any changes you receive get placed in their repo and committed; then merge their repo, which would be a tracked remote, into yours when you need to.
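A rough sketch of that workflow; the paths and the remote name `lead` are placeholders I made up, and on modern Git the first merge of two unrelated histories needs `--allow-unrelated-histories`:

```sh
# One-time setup: a repo that only ever holds their code drops
mkdir their-code && cd their-code
git init

# Each time a drop arrives: mirror it into the tree and commit wholesale
rsync -a --delete --exclude '.git' /path/to/incoming-drop/ .
git add -A
git commit -m "Code drop from lead, <date>"

# In your own repo: track their repo as a remote and merge when needed
cd ../my-repo
git remote add lead ../their-code
git fetch lead
git merge lead/master   # --allow-unrelated-histories on the first merge with modern Git
```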
Check out whether contrib/fast-import/import-zips.py would do what you want.
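If I'm reading the script's own header example correctly, it takes zip files as arguments, feeds them to git fast-import, and builds an import-zips branch; roughly:

```sh
mkdir project && cd project
git init
python /path/to/git/contrib/fast-import/import-zips.py drop-2009-06.zip
git log --stat import-zips
```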
The binaries, if stored in Git, are bound to create a new version with every rebuild (i.e. to be taken into account at the next commit).
So: do you need those binaries in the repo, or can you rebuild them?
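If they can be rebuilt, the usual move is to keep them out of the repository entirely; a minimal sketch, with ignore patterns that are just examples to adapt to your build:

```sh
# Keep rebuildable binaries out of version control
cat >> .gitignore <<'EOF'
*.o
*.exe
bin/
build/
EOF
git add .gitignore
git commit -m "Ignore build products"
```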
As for the sources: in Git, SHA1 is king, and since the external set of files can be fairly different in content (more files, temporary files, files that should be ignored, ...), it would be best to compare the external drop against your git-managed set of files before committing anything.
Thanks to rq for pointing out that timestamps are not part of the SHA1 computation. Only:

- the "blob" keyword,
- the content length,
- a "\0" separator,
- and the file content itself

are part of the SHA1 computation:

sha1("blob " + filesize + "\0" + data)

(source: alexgirard.com)
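You can verify this from the shell: building the `blob <size>\0<content>` header by hand yields the same digest as `git hash-object`, and touching the file's timestamp changes nothing:

```sh
echo 'hello' > f.txt
git hash-object f.txt                          # blob SHA1

# Same digest computed by hand: "blob <size>\0<content>"
printf 'blob %s\0' $(wc -c < f.txt) | cat - f.txt | sha1sum

# The timestamp plays no part: same SHA1 after touch
touch f.txt
git hash-object f.txt
```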
However, when importing a large set of externally managed files into a Git repository, you risk adding new files into git-managed directories, changing those directories' content and hence their SHA1 keys, even though the old git-managed files themselves have not changed.
That means many of the resulting tree changes are artificial if those new files are just temporary files, or files that should be ignored/recreated/regenerated anyway.
The above process just ensures an external way to detect what has changed between an external set of files and a git-managed set of files, and to judge whether each change has to become part of the git working directory or not.
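As a sketch of that detection step, assuming the drop is copied over a clean checkout (paths are placeholders):

```sh
cd my-repo
git status --porcelain                  # start from a clean tree: no output

# Copy the external drop over the working tree, keeping .git intact
rsync -a --exclude '.git' /path/to/external-drop/ .

# Let git report what actually changed (timestamps alone change nothing)
git status --short
git diff --stat

# Then judge, file by file, what belongs in the repository
git add -p                                 # stage the real changes
git checkout -- path/to/regenerated-file   # discard artificial ones
```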