I used this script for years on my VPS, and it's still working.
DBLIST=`mysql -uroot -pROOT_PASSWORD -ANe "SELECT GROUP_CONCAT(schema_name) FROM information_schema.schemata"`
I faced the same problem. I don't know exactly why, but if you add the pv utility to the pipeline, everything works. Maybe it depends on your shell (bash vs. sh).
sudo apt-get install pv
Pipe Viewer (pv) is a very useful utility; it lets you visualize the progress of data flowing through a pipe, for example while it is being written to disk.
Example script:
mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | gzip | pv > ${BACKUP_DEST}/${DB}.sql.gz
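Here is a runnable sketch of the same dump-and-compress pattern. mysqldump is replaced by printf so the pipeline works anywhere, the paths are hypothetical, and the script falls back to cat when pv is not installed:

```shell
#!/bin/sh
# Sketch of the pipeline above: dump, compress, watch progress, write.
# printf stands in for mysqldump; BACKUP_DEST and DB are made-up values.
BACKUP_DEST=/tmp/pv_demo
DB=demo
mkdir -p "$BACKUP_DEST"

# pv sits after gzip, so it reports the compressed bytes being written;
# -q keeps it quiet. cat is used when pv is not installed.
printf 'CREATE TABLE t (id INT);\n' | gzip \
  | { command -v pv >/dev/null 2>&1 && pv -q || cat; } \
  > "$BACKUP_DEST/$DB.sql.gz"

# Confirm the archive round-trips.
gzip -dc "$BACKUP_DEST/$DB.sql.gz"
```

Placing pv after gzip (as in the answer) shows compressed throughput; placing it between mysqldump and gzip would instead show the raw dump throughput.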
I was using mysqldump from the CLI, trying to pipe its output to gzip and/or a file, and getting a "permission denied" error. Even with sudo I got the error, because although mysqldump itself ran as root, the redirection was still performed by the shell running under my own user account, and that account did not have permission to write to the target directory.
To get around this, you can use the tee command in conjunction with sudo:
mysqldump --single-transaction --routines --events --triggers --add-drop-table --extended-insert -u backup -h 127.0.0.1 -p --all-databases | gzip -9 | sudo tee /var/backups/sql/all_$(date +"%Y_week_%U").sql.gz > /dev/null
The | sudo tee /var/backups/... part is what lets us pipe to a directory that is only writable by root: the file is opened by tee, which runs as root, rather than by the shell. The trailing > /dev/null suppresses tee's copy of the output to the screen.
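The pattern can be demonstrated without root. This sketch omits sudo so it runs as a normal user, and uses a hypothetical scratch directory in place of /var/backups/sql:

```shell
#!/bin/sh
# Minimal demo of the tee pattern; sudo is omitted so this runs as a
# normal user, and DEST is a made-up stand-in for /var/backups/sql.
DEST=/tmp/tee_demo
mkdir -p "$DEST"

# With "sudo cmd > file", the *shell* opens the file (as your user) and
# fails. With tee, the file is opened by tee itself, which is the
# process you elevate with sudo in the real command.
printf 'dump data\n' | gzip | tee "$DEST/all.sql.gz" > /dev/null

# Confirm the compressed file was written through the pipe.
gzip -dc "$DEST/all.sql.gz"
```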
20:47:59 0 ~] $ perror 32
OS error code 32: Broken pipe
So errno 32 is "broken pipe". You're piping the mysqldump output to gzip, so this means gzip terminated before mysqldump finished. That could be because your disk is full, or because gzip exceeded a CPU time/usage limit your host has in place.
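You can see which stage of the pipeline actually died by inspecting bash's PIPESTATUS array right after the pipeline. In this sketch, false stands in for a compressor or uploader that exits early; the dump stage then typically dies with exit code 141 (128 + SIGPIPE):

```shell
#!/bin/bash
# Sketch: identify the failing stage of a dump pipeline. "false" is a
# stand-in for a gzip/s3cmd stage that terminates early.
head -c 100 /dev/zero | false | cat > /dev/null
status=("${PIPESTATUS[@]}")   # bash-only: exit code of every stage

echo "compress stage exit: ${status[1]}"
echo "write stage exit: ${status[2]}"
```

Adding `set -o pipefail` near the top of a backup script makes the whole pipeline's exit status non-zero when any stage fails, so a cron job can notice the broken pipe instead of silently producing a truncated .gz file.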
I was seeing this error when piping mysqldump output to s3cmd. It was caused by using the wrong version of s3cmd: on Ubuntu Trusty and Debian Wheezy the packaged s3cmd doesn't support reading from stdin (they ship version 1.1.0).
It's an old topic, but I faced the same problem and found the cause:
My file name was db_26/03.tar.gz, which raised an error like the one above; when I used db.tar.gz instead, there was no error.
So you should check your file name.
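The reason is that a slash inside a file name is interpreted as a directory separator, so db_26/03.tar.gz means "file 03.tar.gz inside a directory named db_26" (this is exactly what a date format like %d/%m produces). A small sketch in a scratch directory:

```shell
#!/bin/sh
# "db_26/03.tar.gz" is a *path*: file 03.tar.gz inside directory db_26.
# Since db_26/ does not exist, creating the file fails.
cd "$(mktemp -d)" || exit 1

touch "db_26/03.tar.gz" 2>/dev/null || echo "failed: db_26/ does not exist"

# Swap the slash for a dash (e.g. date +%d-%m instead of %d/%m in a
# backup script) and the same name works fine:
touch "db_26-03.tar.gz" && echo "created db_26-03.tar.gz"
```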
Check whether the folder /home/backup/db/ exists at your location; if not, create it and every missing subfolder:
mkdir -p /home/backup/db/
then run your command again.
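A backup script can guard against this by creating the destination itself before dumping; -p creates any missing parent directories and succeeds if they already exist. The path here is a /tmp stand-in for /home/backup/db/:

```shell
#!/bin/sh
# Ensure the destination exists before the dump runs; -p creates all
# missing parents and is a no-op when the directory is already there.
DEST=/tmp/backup_demo/db   # hypothetical stand-in for /home/backup/db
mkdir -p "$DEST"
[ -d "$DEST" ] && echo "directory ready: $DEST"
```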