Is there a way to limit the amount of memory that “git gc” uses?
I'm hosting a git repo on a shared host. My repo necessarily has a couple of very large files in it, and every time I try to run "git gc" on the repo now, my process gets killed by the shared hosting provider for using too much memory. Is there a way to limit the amount of memory that git gc can consume? My hope is that it can trade speed for lower memory usage and simply take a little longer to do its work.

Yes, have a look at the help page for git config and look at the pack.* options, specifically pack.depth, pack.window, pack.windowMemory and pack.deltaCacheSize. It's not a totally exact
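As a rough sketch of how you might apply those settings inside the repo on the shared host (the specific values below are illustrative assumptions, not tuned recommendations), something like:

    # Cap memory used per delta-compression window (value is illustrative)
    git config pack.windowMemory "100m"
    # Cap the delta cache used while repacking
    git config pack.deltaCacheSize "50m"
    # Consider fewer candidate objects and shallower delta chains
    git config pack.window 5
    git config pack.depth 10

Lowering pack.window and pack.depth makes git consider fewer objects and shorter delta chains when repacking, so git gc should need less memory at the cost of somewhat larger packs and a longer run.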