All my HTML files reside here:
/home/thinkcode/myfiles/html/
I want to move the newest 10 files to /home/thinkcode/Test.
Here is a version which doesn't use ls. It should be less vulnerable to strange characters in file names:
find . -maxdepth 1 -type f -name '*.html' -print0 |
    xargs -0 stat --printf "%Y\t%n\n" |
    sort -n |
    tail -n 10 |
    cut -f 2 |
    xargs cp -t ../Test/
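One caveat: the final xargs cp step splits its input on whitespace, so names containing spaces can still break there. Below is a sketch of a variant that stays null-delimited end to end; it assumes GNU find and a reasonably recent GNU coreutils (sort, tail and cut with -z support):

# print mtime<TAB>name, NUL-terminated (GNU find)
find . -maxdepth 1 -type f -name '*.html' -printf '%T@\t%p\0' |
    sort -z -n |       # oldest ... newest
    tail -z -n 10 |    # keep the 10 newest
    cut -z -f 2- |     # drop the timestamp
    xargs -0 cp -t ../Test/

As in the original pipeline, cp -t (copy all sources into the named target directory) is a GNU extension.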
I used find for a couple of reasons:
1) If there are too many files in a directory, bash will balk at the wildcard expansion*.
2) Using the -print0 argument to find gets around the problem of bash expanding whitespace in a filename into multiple tokens.
* Strictly speaking, it's the kernel's ARG_MAX limit on the combined size of the argument list and the environment passed to an exec'd command, so it's not strictly a function of the number of file names, but rather the total length of the file names plus the environment variables. Too many environment variables => "Argument list too long" and no wildcard expansion.
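On a GNU/Linux system you can inspect that limit directly; the second command below is specific to GNU xargs:

getconf ARG_MAX                   # combined byte limit for argv + environment
xargs --show-limits < /dev/null   # reports how much of it your current environment already uses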
EDIT: Incorporated some of @glennjackman's improvements. Kept the initial use of find to avoid the wildcard expansion, which might fail in a large directory.
ls -lt *.html | head -10 | awk '{print $NF}' | xargs -i cp {} DestDir
In the above example, DestDir is the destination directory for the copy.
Add -t after xargs to see the commands as they execute, i.e. xargs -i -t cp {} DestDir.
For more information check out the xargs command.
EDIT: As pointed out by @DennisWilliamson (and confirmed by the current man page) regarding the -i option: "This option is deprecated; use -I instead."
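With the non-deprecated spelling the same idea could be written, for example, like this (dropping the awk step, since ls -t by itself prints just the names):

ls -t *.html | head -10 | xargs -I {} cp {} DestDir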
Also, both solutions presented depend on the filenames in question not containing any blanks or tabs.
The shell expands back-ticked commands before cp ever sees them, so you can use a command like this one to copy the 10 latest files to another folder, e.g. /test:
cp `ls -t *.htm | head -10` /test
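The same thing with the modern $(...) form of command substitution, which is easier to nest than back-ticks (the usual whitespace caveat still applies):

cp $(ls -t *.htm | head -10) /test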
ls -lt *.htm | head -10 | awk '{print "cp " $9 " ../Test/"$9}' | sh
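To preview what this will actually run, drop the final | sh and the generated cp commands are simply printed instead of executed:

ls -lt *.htm | head -10 | awk '{print "cp " $9 " ../Test/"$9}'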