mysqldump - Dump multiple databases from separate mysql accounts to one file

Asked by 感情败类, 2021-02-03 10:52

The standard mysqldump command that I use is

mysqldump --opt --databases $dbname --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename


        
2 Answers
  • 2021-02-03 11:35
    1. For every MySQL server account, dump the databases into separate files

    2. For every dump file, execute this command:

      cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz

    There is a similar post on Superuser.com: https://superuser.com/questions/228878/how-can-i-concatenate-two-files-in-unix
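    Putting the two steps together, a minimal sketch might look like this (the hosts, users, passwords, and database names are placeholders, not values from the question):

```shell
# Step 1: dump each account's databases into its own file
# (hypothetical accounts - substitute your own credentials).
mysqldump --opt --databases db1 --host=dbhost --user=user1 --password=pass1 > dump_user1.sql
mysqldump --opt --databases db2 --host=dbhost --user=user2 --password=pass2 > dump_user2.sql

# Step 2: concatenate the dumps and compress them into a single file.
cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz
```

    Restoring later is the reverse: `gunzip -c super_dump.gz | mysql ...` replays both dumps in order.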

  • 2021-02-03 11:52

    Nobody seems to have clarified this, so I'm going to give my 2 cents.

    Note that my experience is with Bash and some of this may be specific to it, so variables and looping might work differently in your environment.

    The best way to get an archive with separate files inside it is to use either zip or tar; I prefer tar for its simplicity and availability.

    Tar itself doesn't do compression, but bundled with bzip2 or gzip it can provide excellent results. Since your example uses gzip I'll use that in my demonstration.

    First, let's attack the problem of the MySQL dumps: the mysqldump command does not split its output into separate files (to my knowledge, anyway), so let's make a small workaround that creates one file per database.

    mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases' | while read -r db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; done
    

    So now we have a one-liner that exports each database to its own file; to change where the files are written, simply edit the part after the > symbol.

    Next, let's look at the syntax for tar:

    tar -czf <output-file> <input-file-1> <input-file-2>
    

    This form lets us pass any number of files to archive.

    The options break down as follows:

    c - create an archive

    z - compress with gzip

    f - write the archive to the named file

    j - compress with bzip2 (used in place of z)
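    For instance, with two dump files already on disk (the file names below are stand-ins), bundling and inspecting the archive looks like this:

```shell
# Stand-ins for real mysqldump output files:
echo '-- dump of db1' > db1.sql
echo '-- dump of db2' > db2.sql

# c = create, z = gzip-compress, f = write to the named file.
tar -czf dumps.tar.gz db1.sql db2.sql

# t lists an archive's contents without extracting it.
tar -tzf dumps.tar.gz
```

    Extracting later with `tar -xzf dumps.tar.gz` gives you back the individual .sql files, unlike a concatenated dump.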

    Our next problem is keeping a list of all the newly created files. We'll expand our while statement to append each file name to a variable as it runs through the databases found in MySQL.

    # Read the database list via process substitution (not a pipe) so the
    # while loop runs in the current shell and DBLIST keeps its value:
    DBLIST=""; while read -r db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases')
    

    Now we have a DBLIST variable listing all the files that will be created, and we can modify our one-line statement to run the tar command after everything else has finished.

    # Process substitution keeps DBLIST alive across the loop; $DBLIST is
    # left unquoted for tar so it splits into one argument per file:
    DBLIST=""; while read -r db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases') && tar -czf "$filename" $DBLIST
    

    This is a very rough approach and doesn't let you specify databases manually. To do that, use the following command instead, which creates a tar file containing only the databases you list.

    # The list is unquoted so the loop sees one database name per iteration:
    DBLIST=""; for db in <database1-name> <database2-name>; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done && tar -czf "$filename" $DBLIST
    

    The loop over the databases reported by MySQL comes from the stackoverflow.com question "mysqldump with db in a separate file", modified to fit your needs.

    And to have the script clean up after itself in the same line, simply add the following at the end of the command ($DBLIST again unquoted so rm gets one file per argument):

    && rm $DBLIST
    

    making the command look like this

    DBLIST=""; for db in <database1-name> <database2-name>; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done && tar -czf "$filename" $DBLIST && rm $DBLIST
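    Pulling it all together, the same logic reads more safely as a Bash function. This is a sketch under a few assumptions: `archive_dumps` is a hypothetical name, `$dbpass` and `$dbuser` are already set, and database names contain no spaces.

```shell
#!/bin/bash
# archive_dumps <output.tar.gz> <db1> [db2 ...]
# Dumps each named database to <db>.sql, bundles the dumps into one
# gzip-compressed tar archive, then removes the intermediate files.
archive_dumps() {
    filename=$1; shift
    DBLIST=""
    for db in "$@"; do
        mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"
        DBLIST="$DBLIST ${db}.sql"
    done
    # DBLIST is deliberately unquoted so it splits into one argument
    # per file name.
    tar -czf "$filename" $DBLIST && rm $DBLIST
}

# To cover every database on the server, feed the list in from mysql:
# archive_dumps all_dbs.tar.gz $(mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases')
```

    Because the arguments are positional, the same function handles both the "all databases" case and a hand-picked list.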
    