I have a PHP script that parses XML files and creates a large SQL file that looks something like this:
INSERT IGNORE INTO table(field1,field2,field3...)
VALUES ("value1","value2",int1...),
("value1","value2",int1)...etc
The full file comes to over 20 GB (I've also tested with a 2.5 GB file, and that fails too).
I've tried commands like:
mysql -u root -p table_name < /var/www/bigfile.sql
This works on smaller files, around 50 MB, but it doesn't work with the larger ones.
I tried:
mysql> source /var/www/bigfile.sql
I also tried mysqlimport, but it won't even process my file properly.
I keep getting an error that says
ERROR 2006 (HY000): MySQL server has gone away
It happens approximately 30 seconds after execution starts.
I set max_allowed_packet to 4GB, but when I verify it with SHOW VARIABLES it only shows 1GB.
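For context, the variable is named max_allowed_packet, and the server clamps it to a hard maximum of 1 GB (1073741824 bytes), so a 4 GB setting will always read back as 1 GB. A typical way to raise it to the cap without editing my.cnf (note that SET GLOBAL only affects new connections, so you must reconnect before importing):

```sql
-- 1 GB is the server-side maximum; larger values are silently clamped
SET GLOBAL max_allowed_packet = 1073741824;

-- reconnect, then verify:
SHOW VARIABLES LIKE 'max_allowed_packet';
```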
Is there a way to do this without wasting another 10 hours?
Try splitting the file into multiple smaller INSERT statements. ERROR 2006 here usually means a single packet exceeded max_allowed_packet, and since that variable is hard-capped at 1 GB (which is also why your 4 GB setting reads back as 1 GB), a multi-row INSERT spanning gigabytes of VALUES can never fit.
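One way to do the split without rerunning the 10-hour PHP job is to rewrite the file so each value tuple becomes its own statement. A minimal sketch, assuming each tuple sits on its own line as in your sample; `demo.sql` and `split.sql` are hypothetical filenames standing in for your real file:

```shell
# Miniature stand-in for the generated file (same shape as the real one)
cat > demo.sql <<'EOF'
INSERT IGNORE INTO table(field1,field2,field3)
VALUES ("a","b",1),
("c","d",2),
("e","f",3);
EOF

# Turn the one giant multi-row INSERT into one INSERT per tuple,
# so no single statement can exceed max_allowed_packet.
awk '
NR == 1 { header = $0; next }      # keep the INSERT IGNORE INTO ... line
{
    sub(/^VALUES /, "")            # strip the leading VALUES keyword
    sub(/[,;]$/, "")               # strip the trailing comma or semicolon
    print header " VALUES " $0 ";" # emit a standalone statement
}' demo.sql > split.sql

wc -l < split.sql   # -> 3 (one statement per value row)
```

The result imports with the same `mysql -u root -p table_name < split.sql` command you already use. One-row INSERTs are slow at this scale, though; a faster variant of the same idea is to batch a few thousand tuples per INSERT, keeping each statement comfortably under the packet limit.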
Source: https://stackoverflow.com/questions/9337855/how-to-import-large-sql-files-into-mysql-table