How do I insert a large number of rows in MySQL?

抹茶落季 2021-01-17 22:03

How do I insert, for example, 100,000 rows into a MySQL table with a single query?

4 Answers
  • 2021-01-17 22:13

    You can't, as far as I know. You will need a loop.

  • 2021-01-17 22:21

    Try using LoadFile(), or convert your data into an XML file and then use the Load and Extract() functions to load the data into the MySQL database.

    This is the fastest single-query option.

    I do the same thing: I had files of around 1.5 GB containing millions of rows, and I have used both options in my case.
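
    For the XML route, MySQL's LOAD XML statement can read such a file directly; a minimal sketch, assuming a hypothetical file data.xml and a hypothetical table my_table whose records are wrapped in <row> elements:

    -- Each <row> element in data.xml becomes one row in my_table
    LOAD XML LOCAL INFILE '/path/to/data.xml'
    INTO TABLE my_table
    ROWS IDENTIFIED BY '<row>';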

  • 2021-01-17 22:27

    INSERT INTO $table VALUES (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');
    

    That inserts 3 rows with one statement. Continue appending value tuples as needed to reach 100,000. I do blocks of roughly 1,000 rows that way when doing ETL work.

    If your data is already sitting in a file, transforming it and using LOAD DATA INFILE will be the best method, but I'm guessing you're asking this because you're doing something similar.

    Also note what another answer said about max_allowed_packet limiting the length of your query.

  • 2021-01-17 22:36

    You can do a batch insert with the INSERT statement, but your query can't be bigger than (slightly less than) max_allowed_packet.

    For 100k rows, depending on the size of the rows, you'll probably exceed this.
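
    You can check and, if needed, raise the limit; a minimal sketch (the 64 MB value is only an illustrative choice, and SET GLOBAL requires sufficient privileges and takes effect for new connections):

    SHOW VARIABLES LIKE 'max_allowed_packet';
    SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB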

    One way would be to split it up into several chunks. This is probably a good idea anyway.
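
    A minimal sketch of the chunked approach, assuming a hypothetical table my_table; wrapping the chunks in a single transaction avoids a commit per statement:

    START TRANSACTION;
    INSERT INTO my_table VALUES (1, 'a'), (2, 'b') /* ... up to ~1,000 rows per statement ... */;
    INSERT INTO my_table VALUES (1001, 'c'), (1002, 'd') /* ... next chunk ... */;
    COMMIT;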

    Alternatively, you can use LOAD DATA INFILE (or LOAD DATA LOCAL INFILE) to load from a tab-delimited (or otherwise delimited) file. See the docs for details.

    LOAD DATA isn't subject to the max_allowed_packet limit.
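
    A minimal sketch, assuming a hypothetical tab-delimited file rows.tsv and table my_table (use LOCAL when the file lives on the client rather than the server):

    LOAD DATA LOCAL INFILE '/path/to/rows.tsv'
    INTO TABLE my_table
    FIELDS TERMINATED BY '\t'
    LINES TERMINATED BY '\n';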
