How to duplicate a MySQL database on the same server

不知归路 2021-01-31 01:51

I have a large MySQL database, let's call it live_db, which I want to replicate on the same machine to provide a test system to play around with (test_db).

6 Answers
  •  有刺的猬
    2021-01-31 02:01

    I have been doing this for several years in different contexts, with small to medium databases (1 GB to 100 GB). The quick and dirty mysqldump works for smaller data sets; the smaller they are, the better it goes.

    When you exceed 5-10 GB, depending on the MySQL load, quick and dirty does not cut it anymore.

    mysqldump

    The problem with mysqldump is that while it dumps, the live database is either unusable, very awkward to use, or the backup will not be consistent - unless you have a wide enough time window in which the unusability of the live database does not matter because the database need not be in use anyway (for example, late at night).

    The default options (here a discussion of the why) make the database next to unusable while it's being dumped, unless the usage is just reading data and little of that. On a busy ecommerce site, you're looking at a client pile-up crash.

    So you use InnoDB and the modern options (not the defaults, as far as I know):

    --single-transaction --skip-lock-tables
    

    which allow the site to run, albeit slower than normal, during the dump. Depending on what the usage is, it might be noticeably slower.

    While you're at it, also dump other data which might be important:

    --events --triggers --routines
    

    (...oh, and this still won't dump user permissions. For a test copy, that probably doesn't matter.)
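    Putting the options above together, a minimal sketch of the whole clone (assuming a root account and the names live_db/test_db from the question; adjust credentials to your setup):

    ```shell
    # Create the empty copy, then pipe the dump straight into it.
    # --single-transaction gives a consistent snapshot of InnoDB tables
    # without locking; the extra flags carry events, triggers and routines.
    mysql -u root -p -e "CREATE DATABASE test_db"
    mysqldump -u root -p \
      --single-transaction --skip-lock-tables \
      --events --triggers --routines \
      live_db | mysql -u root -p test_db
    ```

    Piping avoids an intermediate dump file; note that with -p on both ends you will be prompted for the password twice, so putting credentials in ~/.my.cnf is more convenient for repeated runs.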

    There is a workaround I've seen "advised" (!) as a "great hack", which basically disables transactional integrity, allowing the database to run at full speed while it's being dumped. It's somewhat like removing the brakes from your car to lighten it and make it faster: yes, it will work, but it will have side effects that you might not notice immediately. You will almost surely notice them eventually - and it will be when you need them most, and it won't be pretty.

    However, for a test database, it could still work.

    Xtrabackup

    If you have a "serious" database, what's the reason not to have a "serious" backup?
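    For InnoDB, Percona XtraBackup can take a consistent physical backup while the server keeps running. A sketch of how that could feed a test copy (the backup directory is an assumption; adjust paths to your layout):

    ```shell
    # Take a hot physical backup of the running server
    # (needs filesystem access to the datadir).
    xtrabackup --backup --target-dir=/backups/live_full

    # Apply the redo log so the copied files are consistent.
    xtrabackup --prepare --target-dir=/backups/live_full

    # The prepared directory can then serve as the datadir of a second
    # MySQL instance on another port, giving you the test copy.
    ```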

    Slave replication

    Another possibility, if you have space to spare - and nowadays 20 GB is not that much - is to use an ancillary database.

    You can install a second copy of MySQL Server on a different port, and have it be the slave. Then you will have two identical databases (live master, live slave). The first time you will still have to run a full dump, with all the problems it involves.

    When you need to clone the test database, stop the slave replication - the live slave will now remain "frozen" in time - and back up the live slave into the test db, using mysqldump or just copying the data files. Once done, restart replication.

    The impact on the live master is negligible, and the slave can actually be used for non-update-critical selects.
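    The clone step on the slave might look like this (a sketch: it assumes the second instance listens on port 3307 and reuses the logical dump approach from above):

    ```shell
    # Freeze the slave at a consistent point in time.
    mysql -h 127.0.0.1 -P 3307 -u root -p -e "STOP SLAVE"

    # Copy the frozen data into test_db (logical copy shown here).
    mysqldump -h 127.0.0.1 -P 3307 -u root -p \
      --events --triggers --routines \
      live_db | mysql -u root -p test_db

    # Resume replication; the slave catches up from the master's binlog.
    mysql -h 127.0.0.1 -P 3307 -u root -p -e "START SLAVE"
    ```

    Since the slave is stopped, no locking options are needed for this dump; the live master never notices the copy at all.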
