Import multiple .sql dump files into MySQL databases from the shell


I have a directory with a bunch of .sql files that are mysqldumps of each database on my server.

e.g.

database1-2011-01-15.sql
database2-2         


        
5 Answers
  • 2020-12-12 14:25

    cat *.sql | mysql? Do you need them in any specific order?

    If you have too many to handle this way, then try something like:

    find . -name '*.sql' | awk '{ print "source",$0 }' | mysql --batch
    

    This also gets around some problems with passing script input through a pipeline, though you shouldn't have any problems with pipeline processing under Linux. The nice thing about this approach is that the mysql utility reads in each file itself instead of reading everything from stdin.
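
    If the order does matter, a sorted variant of the same idea works; a minimal sketch (the sort step and the -u/-p connection options are my additions, not part of this answer):

    find . -name '*.sql' | sort | awk '{ print "source", $0 }' | mysql --batch -u root -p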

  • 2020-12-12 14:29

    I don't remember the exact mysql syntax, but it will be something like this:

     find . -name '*.sql' -print0 | xargs -0 cat | mysql ...
    
  • 2020-12-12 14:30

    One-liner to read in all the .sql files and import them:

    for SQL in *.sql; do DB=${SQL/\.sql/}; echo "importing $DB"; mysql "$DB" < "$SQL"; done
    

    The only trick is the bash substring replacement to strip out the .sql to get the database name.
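
    Note that with dump names like the ones in the question (database1-2011-01-15.sql), stripping only the .sql still leaves the date in the name. A sketch that also removes a trailing -YYYY-MM-DD suffix, assuming that exact naming pattern:

    for SQL in *.sql; do
      DB=${SQL%.sql}          # drop the extension
      DB=${DB%-????-??-??}    # drop a trailing -YYYY-MM-DD, if present
      echo "importing $DB"
      mysql "$DB" < "$SQL"
    done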

  • 2020-12-12 14:33

    I created a script some time ago to do precisely this, which I called (completely uncreatively) "myload". It loads SQL files into MySQL.

    Here it is on GitHub

    It's simple and straightforward: it lets you specify mysql connection parameters and will decompress gzipped sql files on the fly. It assumes you have one file per database, and that the base of the filename is the desired database name.

    So:

    myload foo.sql bar.sql.gz
    

    Will create databases called "foo" and "bar" (if they don't already exist) and import the corresponding sql file into each.
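
    The gist is roughly this; a hypothetical sketch of the idea, not the actual myload script (it assumes your mysql credentials come from ~/.my.cnf or similar):

    #!/bin/sh
    # Hypothetical sketch, not the real myload: derive the database name from the
    # file name, create the database if needed, and feed the dump into mysql.
    for f in "$@"; do
      db=$(basename "$f")
      db=${db%.gz}
      db=${db%.sql}
      mysql -e "CREATE DATABASE IF NOT EXISTS \`$db\`"
      case "$f" in
        *.gz) gunzip -c "$f" | mysql "$db" ;;
        *)    mysql "$db" < "$f" ;;
      esac
    done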

    For the other side of the process, I wrote this script (mydumpall), which creates the corresponding sql (or sql.gz) files for each database (or some subset, specified either by name or regex).

  • 2020-12-12 14:34

    There is a superb little script at http://kedar.nitty-witty.com/blog/mydumpsplitter-extract-tables-from-mysql-dump-shell-script which will take a huge mysqldump file and split it into a single file for each table. Then you can run this very simple script to load the database from those files:

    for i in *.sql
    do
      echo "file=$i"
      mysql -u admin_privileged_user --password=whatever your_database_here < "$i"
    done
    

    mydumpsplitter even works on .gz files, but it is much, much slower than gunzipping first, then running it on the uncompressed file.
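
    For example, decompressing a copy up front is a one-liner (the file name here is just a placeholder):

    gunzip -c huge-dump.sql.gz > huge-dump.sql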

    I say huge, but I guess everything is relative. It took about 6-8 minutes to split a 2000-table, 200MB dump file for me.
