I'm trying to use an array to store a list of file names using the find
command.
For some reason the array fails to work in the bash used by the school.
I was having issues with Johannes Weiß's solution: if I was just doing an echo, it would work for the full list of files. However, if I tried running ffmpeg on the next line, the script would only process the first file it encountered. I assumed some IFS funny business due to the pipe, but I couldn't figure it out and went with a for loop instead:
for i in $(find . -name '*.mov' );
do
echo "$i"
done
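For what it's worth, the first-file-only symptom usually isn't IFS at all: ffmpeg reads from stdin by default, so inside a `find | while` pipeline it swallows the rest of the piped file list. A sketch of that guess, using `cat` as a stand-in for ffmpeg (the names and paths here are made up for the demo; with real ffmpeg you'd pass `-nostdin` or redirect `</dev/null`):

```shell
# Hypothetical reproduction: 'cat' stands in for ffmpeg, which also reads
# stdin by default and would otherwise consume the remaining file names.
demo=$(mktemp -d)                 # scratch directory for the demo
cd "$demo" && touch a.mov b.mov c.mov
find . -name '*.mov' | while IFS= read -r f; do
    cat </dev/null >/dev/null     # stand-in for: ffmpeg -nostdin -i "$f" ...
    echo "processing $f"
done
```

With the `</dev/null` redirection (or `-nostdin`) the loop prints one "processing" line per file instead of stopping after the first.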
find . -name '*.txt' | while IFS= read -r FILE; do
echo "Copying $FILE.."
cp "$FILE" /destination
done
I think starpause has the cleanest solution, however it fails when there is whitespace in paths. This is fixed by setting IFS. The correct answer is therefore:
IFS=$'\n'
for i in $(find . -name '*.mov' );
do
echo "$i"
done
unset IFS
You unset IFS in order to restore the default word-splitting behaviour. As to why the $ is needed in IFS=$'\n', see https://unix.stackexchange.com/questions/184863/what-is-the-meaning-of-ifs-n-in-bash-scripting
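Note that even IFS=$'\n' still breaks on the (rare) filename that itself contains a newline. A sketch of the fully robust variant, pairing find's -print0 with a NUL-delimited read (requires bash; the demo directory and file names are invented):

```shell
# Sketch: NUL can never appear in a path, so -print0 plus read -d ''
# handles spaces, tabs and even embedded newlines.
demo=$(mktemp -d)
touch "$demo/plain.mov" "$demo/with space.mov"
find "$demo" -name '*.mov' -print0 |
while IFS= read -r -d '' f; do
    printf 'found: %s\n' "$f"
done
```

No IFS juggling or unset is needed afterwards, since the assignment is scoped to the read call.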
You could use something like this:
find . -name '*.txt' | while IFS= read -r line; do
    echo "Processing file '$line'"
done
E.g. make a copy:
find . -name '*.txt' | while IFS= read -r line; do
    echo "Copying '$line' to /tmp"
    cp -- "$line" /tmp
done
HTH
Just don't put blanks around the equals sign:
ar=($(find . -name "*.txt"))
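Be aware that this unquoted expansion word-splits paths containing spaces. A sketch of a safer way to fill the array, using readarray (bash 4.4+ for -d; the demo paths are invented):

```shell
# Sketch: readarray -d '' reads one NUL-delimited entry per array element,
# so "b c.txt" stays a single entry instead of splitting into "b" and "c.txt".
demo=$(mktemp -d)
touch "$demo/a.txt" "$demo/b c.txt"
readarray -d '' -t ar < <(find "$demo" -name '*.txt' -print0)
echo "${#ar[@]} files"
```

The plain ar=($(find ...)) form would report 3 entries here; the readarray version reports 2.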
Avoid backticks, if possible, since they're deprecated in favour of $(...). They can be easily confused with apostrophes, especially in poor fonts, and they don't nest so well.
In most cases you will be best served if you iterate through a find result directly with -exec, -execdir, -ok or -okdir.
for and while loops are hard to do right when filenames contain blanks, newlines or tabs.
find ./ -name "*.txt" -exec grep "pattern" {} ";"
The {} doesn't need masking. You will often see a find/xargs combination, which starts an additional process too:
find ./ -name "*.txt" | xargs grep "pattern"
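Note that the plain pipe-to-xargs form also splits on whitespace in paths. A sketch of the whitespace-safe pairing, with NUL delimiters on both sides (the demo directory and pattern are invented):

```shell
# Sketch: -print0 emits NUL-terminated paths and xargs -0 consumes them,
# so a path like "a b.txt" reaches grep as a single argument.
demo=$(mktemp -d)
printf 'hello\n' > "$demo/a b.txt"
find "$demo" -name '*.txt' -print0 | xargs -0 grep -l "hello"
```

grep -l prints just the matching file's path, which makes the result easy to feed to a further step.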