Mysqldump Split Table dump into smaller files

好久不见. Submitted on 2019-12-12 03:49:21

Question


Using mysqldump, I can dump my Databases and tables into separate files.

My question is, is there a way within mysqldump to split up these table files into smaller parts?

So instead of one 10GB .sql file, I would get ten 1GB .sql files for the same table?


Answer 1:


You can use mysqldump to grab data from a query; however, I've always found this hard to manage when you need to split the data into chunks of a specific size.

Since you want 1 GB files, here is how I would split the table into 1 GB segments.

I've used INTO OUTFILE here, although mysqldump could also be used at this stage:

SELECT * FROM table
ORDER BY adminid ASC
INTO OUTFILE 'c:/table.csv'
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
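Where the answer says mysqldump could also be used at this stage, the closest equivalent is its --tab mode, which writes one data file per table using the same FIELDS/LINES options. A rough sketch, reusing the names above (username, password, databasename, and c:/dumpdir are placeholders, not anything from the original answer):

mysqldump --user=username --password=password --tab=c:/dumpdir --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename table

This produces table.sql (the schema) alongside table.txt (the data). Like INTO OUTFILE, --tab is written by the database server itself, so c:/dumpdir must exist on the server and be writable by the MySQL process.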

If you're using Windows, it lacks a good split utility, so I would suggest the GNU Core Utilities bundle: http://gnuwin32.sourceforge.net/packages/coreutils.htm

After installing, you can use split from the command prompt:

cd C:\Program Files (x86)\GnuWin32\bin
split -C 1024m -d c:\table.csv c:\table.part_
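For reference, -C 1024m caps each piece at 1 GB while keeping whole lines together (unlike -b, which cuts on raw bytes and could split a CSV row in half), and -d selects numeric suffixes. Assuming the paths above, you can check the result with:

dir c:\table.part_*
REM expect c:\table.part_00, c:\table.part_01, ... each no larger than 1 GB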

If you're using Linux, you already have access to a good split utility.

If you export the files, you will probably want to import them again at some point. That is where the .part_ suffix in the split command is important: mysqlimport derives the name of the table to import into from the file name, stripping everything from the . onward, so multiple part files can all be imported into the same database table.

These can then be imported, one part at a time, using:

mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename c:\table.part_00

--local is needed; otherwise mysqlimport looks for the files on the remote host

--compress is vital, as it saves a lot of bandwidth
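To import all of the parts in one go, you can loop over them from the command prompt; a minimal sketch under the same assumptions as the command above (inside a .bat file, double the percent sign, i.e. %%f):

for %f in (c:\table.part_*) do mysqlimport --local --compress --user=username --password=password --host=dbserver.host --fields-terminated-by=, --fields-optionally-enclosed-by="\"" --lines-terminated-by="\n" databasename %f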




Answer 2:


@jason, you can split the full dump into tables and databases. You can use mysql-dump-splitter to extract the table or database of your choice. You can also apply filters during the dump process, as follows:

Dump all data for 2014:
mysqldump --all-databases --where="DATEFIELD >= '2014-01-01' AND DATEFIELD < '2015-01-01'" | gzip > ALLDUMP.2014.sql.gz

This assumes every table has a DATEFIELD column! Alternatively, you can specify an ID column to restrict the dump to only the IDs in a given range.
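A minimal sketch of that ID-based alternative, dumping one table in fixed-size key ranges (mydb, mytable, the id column, and the range boundaries are placeholders, not anything from the original answer):

mysqldump --user=username --password=password --where="id >= 1 AND id < 1000000" mydb mytable | gzip > mytable.chunk_00.sql.gz
mysqldump --user=username --password=password --where="id >= 1000000 AND id < 2000000" mydb mytable | gzip > mytable.chunk_01.sql.gz

Each invocation yields a self-contained .sql.gz file, so this gets you several smaller files per table directly from mysqldump, provided you can estimate how many rows add up to your target size.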



Source: https://stackoverflow.com/questions/15661887/mysqldump-split-table-dump-into-smaller-files
