I have calculated directed modularity by means of DirectedLouvain (https://github.com/nicolasdugue/DirectedLouvain). I am now trying to test the significance of the values obtained, which means running the project's convert step over a large number of input files. How can I run ./convert -i input.txt -o input.bin -w input.weights on every matching file in a directory?
You could use GNU Parallel to run your jobs in parallel across all your CPU cores like this:
parallel convert -i {} -o {.}.bin -w {.}.weights ::: input*.txt
Initially, you may like to do a "dry run" that shows what it would do without actually doing anything:
parallel --dry-run convert -i {} -o {.}.bin -w {.}.weights ::: input*.txt
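Assuming, for illustration, that the directory contains input1.txt and input2.txt, the dry run simply prints the commands it would have executed:

convert -i input1.txt -o input1.bin -w input1.weights
convert -i input2.txt -o input2.bin -w input2.weights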
If you get errors about the argument list being too long because you have too many files, you can feed their names in on stdin like this instead:
find . -name "input*.txt" -print0 | parallel -0 convert -i {} -o {.}.bin -w {.}.weights
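By default GNU Parallel starts one job per CPU core; if convert is heavy on memory you can cap the number of concurrent jobs with -j (the 4 below is just an example value):

find . -name "input*.txt" -print0 | parallel -0 -j 4 convert -i {} -o {.}.bin -w {.}.weights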
You can use find to list your files and execute a command on all of them:
find -name '*.ext' -exec ./runThisExecutable '{}' \;
If you have a.ext and b.ext in a directory, this will run ./runThisExecutable a.ext and ./runThisExecutable b.ext.
To test whether it identifies the right files, you can run it without -exec so it only prints the filenames:
find -name '*.ext'
./a.ext
./b.ext
So, applying that to your convert command:
find -type f -name '*.ext' |
while IFS= read -r file; do
    file_no_extension=${file##*/}
    file_no_extension=${file_no_extension%%.*}
    ./convert -i "$file" -o "$file_no_extension".bin -w "$file_no_extension".weights
done
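In case the two parameter expansions look cryptic: the first strips the leading directories, the second strips everything from the first dot onward. A quick sketch with a hypothetical path:

file='./subdir/graph1.ext'                    # hypothetical input path
file_no_extension=${file##*/}                 # -> graph1.ext (directories removed)
file_no_extension=${file_no_extension%%.*}    # -> graph1 (suffix removed at the first dot)
echo "$file_no_extension"                     # prints: graph1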
# with find:
find -type f -name '*.ext' -exec sh -c 'f=$(basename "$1" .ext); ./convert -i "$1" -o "$f".bin -w "$f".weights' _ {} \;
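A variant of the same idea: terminating -exec with + instead of \; hands the file names to one shell in batches, so you avoid spawning a new shell per file (otherwise the behaviour is the same):

find -type f -name '*.ext' -exec sh -c 'for f; do b=$(basename "$f" .ext); ./convert -i "$f" -o "$b".bin -w "$b".weights; done' _ {} +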
# with xargs:
find -type f -name '*.ext' |
xargs -d '\n' -n1 sh -c 'f=$(basename "$1" .ext); ./convert -i "$1" -o "$f".bin -w "$f".weights' _
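If you also want the runs parallelised without installing GNU Parallel, GNU xargs can start several of these shells at once via -P (the 4 is again just an illustrative job count):

find -type f -name '*.ext' |
xargs -d '\n' -n1 -P 4 sh -c 'f=$(basename "$1" .ext); ./convert -i "$1" -o "$f".bin -w "$f".weights' _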