I'm setting up a multi-user, multi-server environment. All developers will use Git and clone various repos from GitHub etc. (in one account I control).
How do most people get GitHub files to their web servers?
Generally by pushing to a bare repo on the web server, with a post-receive
hook that checks the repo out into the site's working tree, similar to:
git --git-dir=/path/to/bare_repo.git --work-tree=/path/to/website/httpdocs checkout -f
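A minimal, self-contained sketch of that setup, using throwaway temp-dir paths in place of the real `bare_repo.git` and `httpdocs` locations (the paths, branch name `main`, and file `index.html` are all placeholders, not from the original):

```shell
#!/bin/sh
# Sketch: a bare repo whose post-receive hook deploys into a working tree.
set -e
TMP=$(mktemp -d)
# 1. Create the bare repo and the target docroot.
git init --bare -q "$TMP/site.git"
mkdir -p "$TMP/httpdocs"
# 2. Install the post-receive hook ($TMP expands now, when the file is written).
cat > "$TMP/site.git/hooks/post-receive" <<EOF
#!/bin/sh
git --git-dir=$TMP/site.git --work-tree=$TMP/httpdocs checkout -f main
EOF
chmod +x "$TMP/site.git/hooks/post-receive"
# 3. Clone, commit on "main", and push: the hook fires and populates httpdocs.
git clone -q "$TMP/site.git" "$TMP/work" 2>/dev/null
cd "$TMP/work"
git symbolic-ref HEAD refs/heads/main   # pin the branch name used by the hook
echo "hello" > index.html
git add index.html
git -c user.email=a@b -c user.name=demo commit -qm "first"
git push -q origin main
cat "$TMP/httpdocs/index.html"   # should print: hello
```

Note the explicit `--git-dir`: inside a hook, Git sets `GIT_DIR` in the environment, and passing the path on the command line avoids surprises from that.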
You can also pull from the bare repo itself, through a cron job for instance:
*/10 * * * * user /home/usern/git-pull-requests/fetch.sh
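The `fetch.sh` contents aren't shown in the original; at its core it would just `cd` into the site checkout and fast-forward it. A self-contained sketch of that step, demonstrated against throwaway repos (all paths and file names here are placeholders):

```shell
#!/bin/sh
# Sketch of what a cron-driven fetch.sh boils down to: fast-forward the
# deployed checkout from its origin.
set -e
TMP=$(mktemp -d)
# An upstream repo standing in for GitHub / the bare repo.
git init -q "$TMP/origin"
cd "$TMP/origin"
echo "v1" > index.html
git add index.html
git -c user.email=a@b -c user.name=demo commit -qm "v1"
# The deployed site is a plain clone of it.
git clone -q "$TMP/origin" "$TMP/httpdocs"
# A new commit lands upstream...
echo "v2" > index.html
git -c user.email=a@b -c user.name=demo commit -qam "v2"
# ...and this is what each cron tick would run:
cd "$TMP/httpdocs"
git pull -q --ff-only
cat index.html   # should print: v2
```

`--ff-only` keeps the cron job from ever attempting a merge in the deployed tree; if the site checkout has diverged, the pull fails loudly instead.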
Both pulling and pushing, however, require Git to be installed on the server.
If you don't want that, you can use git archive to create an archive (zip or tar), copy it over, and have a custom script uncompress it on the server.
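A self-contained sketch of the archive route: only the archiving step needs Git; the server side is plain `tar`. The repo, paths, and file names are throwaway placeholders, and the copy step (scp/rsync) is left as a comment:

```shell
#!/bin/sh
# Sketch: deploy via git archive to a server that has no Git installed.
set -e
TMP=$(mktemp -d)
# A throwaway repo standing in for your real one.
git init -q "$TMP/repo"
cd "$TMP/repo"
echo "hello" > index.html
git add index.html
git -c user.email=a@b -c user.name=demo commit -qm "first"
# Build the archive from HEAD (the only step that needs Git)...
git archive --format=tar -o "$TMP/site.tar" HEAD
# ...copy site.tar to the server (scp/rsync), then unpack it there:
mkdir -p "$TMP/httpdocs"
tar -xf "$TMP/site.tar" -C "$TMP/httpdocs"
cat "$TMP/httpdocs/index.html"   # should print: hello
```

One trade-off: unlike a checkout, unpacking an archive never deletes files that were removed from the repo, so the custom script may want to clear the target directory first.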