I cloned a project from GitHub with git clone --mirror. That left me with a repository containing a packed-refs file, a .pack file, and an .idx file.
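Roughly, that looks like this (the URL and pack hash are placeholders, and the packed-refs header line varies by git version):

    $ git clone --mirror https://github.com/example/project.git
    $ ls project.git/objects/pack
    pack-<sha1>.idx  pack-<sha1>.pack
    $ head -2 project.git/packed-refs
    # pack-refs with: peeled fully-peeled
    <sha1> refs/heads/master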
The reason the packed refs exist is to speed up access in a repo with zillions of refs - it's easier to look at a single file with many lines than to hit the file system once for every single ref. Anything in git which needs to know about refs goes through code which can read both the refs directory and the packed refs file. Unpacking it would defeat its purpose. If you want to access refs, use the plumbing commands (e.g. show-ref, for-each-ref, update-ref...). I can't really think of any kind of access which would be faster and easier with the directory structure than with the plumbing commands (especially with for-each-ref available).
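For example, these plumbing commands behave exactly the same whether a ref is loose under refs/ or listed in packed-refs (the ref name and SHA below are just placeholders):

    # list every ref together with the object it points to
    $ git for-each-ref --format='%(refname) %(objectname)'
    # resolve one particular ref
    $ git show-ref refs/heads/master
    # move a ref without caring where it happens to be stored
    $ git update-ref refs/heads/master <new-sha>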
And yes, packed objects are (like packed refs) created for improved performance, but there's a huge difference. A packed-refs file is just a bunch of independent lines. You can, essentially for free, add to or remove from it; there's no need to unpack it in order to modify it. Packed objects, on the other hand, are delta-compressed, so the objects inside depend on each other. They greatly reduce disk usage, and objects can be read from them at reasonable cost, but modifying the set of objects in a pack is much more expensive than modifying loose objects, so it's only done periodically by git repack (called by git gc) - though I don't believe git repack actually unpacks the objects; it just reads them from the packfile, packs them together with the loose ones, and makes a new pack.
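To make that concrete, a periodic repack looks roughly like this when run inside the mirror repository (the pack name is a placeholder):

    # inspect what the existing pack contains (objects, deltas, sizes)
    $ git verify-pack -v objects/pack/pack-<sha1>.idx
    # fold loose objects and existing packs into one new pack,
    # deleting the now-redundant old packs afterwards
    $ git repack -a -d
    # or just let gc decide when repacking is worth doing
    $ git gc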
However, when a pack is transferred from a remote, it's typically unpacked on the local side. I see a call to an unpack method in the git receive-pack source, and the pack-objects manpage says:
The git unpack-objects command can read the packed archive and expand the objects contained in the pack into "one-file one-object" format; this is typically done by the smart-pull commands when a pack is created on-the-fly for efficient network transport by their peers.
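So if you really did want loose objects locally, a sketch of the manual route (run inside the mirror repository; the pack name is a placeholder) would be to move the pack out of the object store first - unpack-objects ignores objects that already exist in the repository - and then expand it:

    # move the pack and its index out of objects/pack
    $ mv objects/pack/pack-<sha1>.pack objects/pack/pack-<sha1>.idx /tmp/
    # expand it into one-file-per-object loose format
    $ git unpack-objects < /tmp/pack-<sha1>.pack

But again, for normal use there's no reason to do this.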