I have successfully dumped an entire MySQL database using mysqldump --databases, generating a nice .txt file. However, I can't see how to read the whole file back into MySQL in one go; mysqlimport seems to want just one table at a time.
When you've generated a dump file (say db-dump.sql) with mysqldump, you can import it into your other database with the mysql command:
mysql --user=XXX --password=XXX --host=YOUR_HOST DATABASE_NAME < db-dump.sql
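Note that since the question's dump was made with --databases, the file should already contain CREATE DATABASE and USE statements, so (if I'm reading the mysqldump output right) you should be able to omit the database name entirely:
mysql --user=XXX --password=XXX --host=YOUR_HOST < db-dump.sql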
And if you don't want the password to appear on the command line, you can use:
mysql --user=XXX -p --host=YOUR_HOST DATABASE_NAME < db-dump.sql
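Another option, not covered in this answer, is to keep the credentials in a client option file so the password never appears in the command at all. A minimal sketch, assuming the credentials live in ~/.my.cnf (restrict its permissions, e.g. chmod 600):
[client]
user=XXX
password=XXX
With that in place, the client picks up the credentials automatically:
mysql --host=YOUR_HOST DATABASE_NAME < db-dump.sql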
As a side note, if you want to copy one DB to another one, you don't need to use a file at all, and can just directly pipe the output of mysqldump to mysql:
mysqldump --user=XXX --password=XXX --host=SOURCE_HOST SOURCE_DB | mysql --user=XXX --password=XXX --host=DESTINATION_HOST DESTINATION_DB
(It should even be faster, since no temporary file has to be written to disk.)
I do this frequently:
mysqldump -u username -p databasename > dump.sql
To load:
mysql -u username -p targetdatabasename < dump.sql
The -p switch instructs the mysql client to prompt you for the password of user username once the command launches.
Your question is probably a duplicate, though.
You can just use 'source' from within the mysql client:
source dumpfile.sql
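A rough sketch of such an interactive session, where mydb is a placeholder name (the CREATE DATABASE step is only needed if the target database doesn't already exist, e.g. when the dump wasn't made with --databases):
mysql -u user -p
mysql> CREATE DATABASE mydb;
mysql> USE mydb;
mysql> source dumpfile.sql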
Or feed it in directly from the command line (you'll be prompted for the password):
mysql -u user -p database < dumpfile.sql
This is because the result of mysqldump is just a SQL file that can be run via mysql as usual.
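To illustrate, a dump made with --databases typically begins with statements along these lines (the exact output varies by version and options, and mydb/mytable are placeholders), which is why the whole file can simply be replayed through the client:
CREATE DATABASE IF NOT EXISTS mydb;
USE mydb;
DROP TABLE IF EXISTS mytable;
CREATE TABLE mytable ( ... );
INSERT INTO mytable VALUES ( ... );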
Source: https://stackoverflow.com/questions/2298176/how-to-use-mysqlimport-to-read-in-result-of-mysqldump-databases