Question
I'm using Perl's Archive::Tar module. The problem is that it reads everything into memory, builds the archive there, and only then writes it to the file system, so there is a limit on the maximum file size that can be archived; most of the time it dies with an out-of-memory error. GNU tar, by contrast, reads a chunk of a file at a time, archives it, and writes it out, so it can handle files of any size. How can I do that using Perl's Archive::Tar module?
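To make the failure mode concrete, here is a minimal sketch of the usual Archive::Tar workflow, using its documented `add_files` and `write` methods. The demo file and archive names are placeholders; with a multi-gigabyte input, the `add_files` step is where memory runs out.

```perl
use strict;
use warnings;
use Archive::Tar;

# Create a small demo file (a real workload would be a multi-GB file).
open my $fh, '>', 'demo.txt' or die "open: $!";
print $fh "hello\n" x 1000;
close $fh or die "close: $!";

# Archive::Tar slurps every added file into memory before writing,
# which is why very large files exhaust RAM.
my $tar = Archive::Tar->new;
$tar->add_files('demo.txt');
$tar->write('demo.tar');
```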
Answer 1:
It looks like there is a different module that doesn't use an in-memory structure: Archive::Tar::Streamed. The downside is that it requires tar to be available on the system it is run on. Still, it is better than puppet-stringing tar yourself.
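A sketch of how Archive::Tar::Streamed is typically used, assuming its documented interface (a constructor that takes an open filehandle, and an `add` method that appends one file at a time to the stream); the file names are placeholders and this is untested:

```perl
use strict;
use warnings;
use Archive::Tar::Streamed;

# Open the target archive and stream files into it one at a time,
# so only one file at a time needs to fit in memory.
open my $fh, '>', 'backup.tar' or die "open: $!";
my $tar = Archive::Tar::Streamed->new($fh);
$tar->add($_) for ('file1.dat', 'file2.dat');
close $fh or die "close: $!";
```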
Answer 2:
It looks like Archive::Tar::Wrapper is your best bet. I haven't tried it myself, but it uses your system's tar executable and doesn't keep files in memory.
Contrary to Chas. Owen's answer, Archive::Tar::Streamed does keep files in memory and does not use your system's tar. It actually uses Archive::Tar internally, but it processes one file at a time (taking advantage of the fact that tar archives can be concatenated). This means that Archive::Tar::Streamed can handle archives bigger than memory, as long as each individual file in the archive fits in memory. But that's not what you asked for.
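For reference, a hedged sketch of the Archive::Tar::Wrapper approach, following the module's documented `add($logical, $physical)` and `write` calls: files are staged on disk and the final archive is built by the system's tar binary, so nothing large is held in RAM. The paths here are placeholders.

```perl
use strict;
use warnings;
use Archive::Tar::Wrapper;

my $arch = Archive::Tar::Wrapper->new();

# add(logical_path_in_archive, physical_path_on_disk):
# the file is staged in a temp directory, not slurped into memory.
$arch->add('data/huge.iso', '/path/to/huge.iso');

# Delegate to the system tar executable to write the archive.
$arch->write('backup.tar');
```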
Source: https://stackoverflow.com/questions/653127/how-can-i-tar-files-larger-than-physical-memory-using-perls-archivetar