How to optimize MySQL to insert millions of rows?

旧巷少年郎 · 2020-12-25 10:08

I need to insert millions of rows into a MySQL database (InnoDB engine). I have a problem with speed once the tables grow large: almost all of the time is spent on the INSERT queries.

2 Answers
  • 2020-12-25 10:48

    If you're talking about a large number of INSERT statements, look into transactions. Virtually every language's SQL driver supports them, and wrapping many writes in a single transaction speeds up anything that writes to the DB. An added bonus: if something goes wrong, you can roll the changes back. A minimal sketch follows.
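
    For illustration, a minimal SQL sketch of the idea; the table name events and its columns are made up:

      START TRANSACTION;
      -- Statements inside the transaction are not flushed individually;
      -- InnoDB syncs the log once at COMMIT instead of once per INSERT.
      INSERT INTO events (id, payload) VALUES (1, 'a');
      INSERT INTO events (id, payload) VALUES (2, 'b');
      -- Multi-row VALUES lists cut round-trips and parse overhead further:
      INSERT INTO events (id, payload) VALUES (3, 'c'), (4, 'd'), (5, 'e');
      COMMIT;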

  • 2020-12-25 10:54

    To import a large bulk of data into InnoDB:

    1. Set in the MySQL configuration (see the my.cnf sketch after this list):

      • innodb_doublewrite = 0
      • innodb_buffer_pool_size = 50%+ of system memory
      • innodb_log_file_size = 512M
      • disable the binary log (log-bin = 0 in older guides; skip-log-bin on MySQL 8.0+)
      • innodb_support_xa = 0 (pre-5.7.10 servers only; the variable was later removed)
      • innodb_flush_log_at_trx_commit = 0

      These settings trade crash safety for import speed, so restore your normal values once the load finishes.
    2. Run right after the transaction starts (shown in the session sketch below):

      SET FOREIGN_KEY_CHECKS = 0;

      SET UNIQUE_CHECKS = 0;

      SET AUTOCOMMIT = 0;

    3. Set right before the transaction ends:

      SET UNIQUE_CHECKS = 1;

      SET FOREIGN_KEY_CHECKS = 1;
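
    As a sketch, the step 1 settings go in the [mysqld] section of my.cnf. The 8G buffer pool value is only an example, size it to your machine, and skip-log-bin is the MySQL 8.0+ spelling of "no binary log" (on older servers simply omit log-bin):

      [mysqld]
      innodb_doublewrite = 0
      innodb_buffer_pool_size = 8G          # 50%+ of system memory on a dedicated box
      innodb_log_file_size = 512M
      skip-log-bin                          # MySQL 8.0+; on older servers, omit log-bin
      innodb_support_xa = 0                 # pre-5.7.10 servers only
      innodb_flush_log_at_trx_commit = 0    # don't fsync the log on every commit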
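
    And a minimal end-to-end session sketch tying steps 2 and 3 together; the table big_table and its columns are illustrative:

      SET FOREIGN_KEY_CHECKS = 0;
      SET UNIQUE_CHECKS = 0;
      SET AUTOCOMMIT = 0;            -- starts an implicit transaction

      -- Load the data in large multi-row batches:
      INSERT INTO big_table (col1, col2) VALUES (1, 'a'), (2, 'b'), (3, 'c');
      -- ... repeat for the remaining rows, a few thousand per statement ...

      SET UNIQUE_CHECKS = 1;
      SET FOREIGN_KEY_CHECKS = 1;
      COMMIT;
      SET AUTOCOMMIT = 1;            -- restore normal autocommit behaviour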
