I'd like to generate dummy files in bash. The content doesn't matter; random content would be nice, but a file full of the same byte is also acceptable.
My first attempt was
If your file system is ext4, btrfs, xfs or ocfs2, and you don't care about the content, you can use fallocate. It's the fastest method if you need big files.
fallocate -l 100KB dummy_100KB_file
See "Quickly create a large file on a Linux system?" for more details.
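If fallocate isn't available, or your file system doesn't support it, `truncate` from GNU coreutils creates a sparse file of the requested apparent size. A minimal sketch (not from the original answer; GNU `stat` is assumed for the size check):

```shell
# Sketch: create a sparse 100 KB dummy file with coreutils truncate.
# The apparent size is 100000 bytes; no data blocks are actually written.
truncate -s 100000 dummy_100KB_file
stat -c %s dummy_100KB_file   # 100000
```

Unlike fallocate, truncate doesn't reserve blocks: reads return zeros, but the disk space isn't guaranteed up front.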
Easy way:
Create a file named test containing the single line "test".
Then execute:
cat test >> test
Press Ctrl+C after a minute and you will have plenty of gigabytes :)
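A bounded variant of the same doubling trick stops at a target size instead of relying on Ctrl+C. Note that recent GNU cat refuses `cat test >> test` with "input file is output file", so this sketch doubles through a temporary copy (file names and the 1 MiB target are my own choices; GNU `stat` assumed):

```shell
# Sketch: grow a file by repeated doubling until it reaches ~1 MiB.
echo test > grow.txt
target=$((1024 * 1024))
while [ "$(stat -c %s grow.txt)" -lt "$target" ]; do
    cp grow.txt chunk.tmp        # copy first: GNU cat rejects
    cat chunk.tmp >> grow.txt    # appending a file to itself
done
rm chunk.tmp
```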
You may use dd for this purpose:
dd if=/dev/urandom bs=1024 count=5 of=dummy
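That dd call writes bs × count = 1024 × 5 = 5120 random bytes. When you want an exact byte count, `head -c` reading from /dev/urandom is an equivalent one-liner (my variation, not part of the original answer; GNU `stat` assumed for the check):

```shell
# Sketch: 5120 bytes of random content, sized directly in bytes.
head -c 5120 /dev/urandom > dummy
stat -c %s dummy   # 5120
```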
Note that
x=`expr $x + 1`;
isn't the most efficient way to do integer arithmetic in bash. Use arithmetic expansion in double parentheses instead:
x=$((x+1))
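For illustration, here are the common bash forms of an integer increment side by side (a hedged sketch; the bare `(( ))` forms are bash-specific, not POSIX sh):

```shell
# Sketch: three equivalent ways to increment x in bash.
x=0
x=$((x + 1))   # POSIX arithmetic expansion
((x += 1))     # bash arithmetic command
((x++))        # bash post-increment
echo "$x"      # 3
```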
But for an incrementing counter in a loop, the for loop was invented:
x=0
while [ $x -lt 100000 ]
do
  echo a >> dummy.zip
  x=`expr $x + 1`
done
in contrast to:
for ((x=0; x<100000; ++x))
do
echo a
done >> dummy.zip
But there is a still simpler form of the for loop:
for x in {0..100000}
do
echo a
done >> dummy.zip
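One caveat with this form: brace expansion is inclusive on both ends, so {0..100000} produces 100001 iterations, one more than the while loop above, and bash expands the whole word list in memory before the loop starts. A sketch checking the count (bash brace expansion assumed):

```shell
# Sketch: count the lines {0..100000} actually produces.
n=$(for x in {0..100000}; do echo a; done | wc -l)
echo "$n"   # 100001
```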
echo "To print the words in sequence from the file"
c=1
for w in $(cat file)
do
  echo "$c . $w"
  c=$((c+1))
done
This will generate a text file of exactly 100,000 bytes: 10,000 lines of "123456789" at 10 bytes per line, counting the newline:
yes 123456789 | head -10000 > dummy.file
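If the target size doesn't divide evenly into lines, `head -c` cuts the stream at an exact byte count instead of a line count (my variation, not part of the original answer; GNU `stat` assumed for the check):

```shell
# Sketch: exactly 100000 bytes, independent of the repeated line's length.
yes 123456789 | head -c 100000 > dummy.file
stat -c %s dummy.file   # 100000
```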
Possibly
dd if=/dev/zero of=/dummy10MBfile bs=1M count=10
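As written, of=/dummy10MBfile lands in the root directory and needs root privileges. The same recipe in the current directory, with a size check, looks like this (a sketch; GNU `stat` assumed). bs=1M count=10 gives 10 × 1048576 = 10485760 bytes:

```shell
# Sketch: 10 MiB of zero bytes in the current directory.
dd if=/dev/zero of=dummy10MBfile bs=1M count=10 2>/dev/null
stat -c %s dummy10MBfile   # 10485760
```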