C# Optimisation: Inserting 200 million rows into database

不知归路 2021-02-10 11:41

I have the following (simplified) code which I'd like to optimise for speed:

long inputLen = 50000000; // 50 million 
DataTable dataTable = new DataTable();
// ... (rest of the snippet is truncated in the original post)

3 Answers
  • 2021-02-10 12:10

    Instead of holding a huge data table in memory, I would suggest implementing an IDataReader which serves up the data as the bulk copy goes. This reduces the need to keep everything in memory up front, and should therefore improve performance.

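    A minimal sketch of that idea: the class name, the two-column layout, and the assumption about which members SqlBulkCopy actually exercises (Read, FieldCount, GetValue and a few others) are mine, not part of this answer. Rows are synthesized on demand, so nothing is materialized up front:

    using System;
    using System.Data;

    // Illustrative reader that generates rows lazily instead of holding a
    // DataTable in memory. Only the members the bulk copy is expected to
    // need get real implementations; the rest throw.
    sealed class RowGeneratingReader : IDataReader
    {
        private readonly long _rowCount;
        private long _current = -1;

        public RowGeneratingReader(long rowCount) { _rowCount = rowCount; }

        public int FieldCount => 2; // assumed layout: Id (long), Value (string)

        // Advance to the next synthetic row; returning false ends the bulk copy.
        public bool Read() => ++_current < _rowCount;

        public object GetValue(int i) => i == 0 ? (object)_current : "value " + _current;

        public int GetValues(object[] values)
        {
            int n = Math.Min(values.Length, FieldCount);
            for (int i = 0; i < n; i++) values[i] = GetValue(i);
            return n;
        }

        public string GetName(int i) => i == 0 ? "Id" : "Value";
        public int GetOrdinal(string name) => name == "Id" ? 0 : 1;
        public Type GetFieldType(int i) => i == 0 ? typeof(long) : typeof(string);
        public bool IsDBNull(int i) => false;
        public object this[int i] => GetValue(i);
        public object this[string name] => GetValue(GetOrdinal(name));

        public void Dispose() { }
        public void Close() { }
        public bool IsClosed => false;
        public int Depth => 0;
        public int RecordsAffected => -1;
        public bool NextResult() => false;

        // Not needed for a plain bulk copy of these column types.
        public DataTable GetSchemaTable() => throw new NotSupportedException();
        public string GetDataTypeName(int i) => throw new NotSupportedException();
        public bool GetBoolean(int i) => throw new NotSupportedException();
        public byte GetByte(int i) => throw new NotSupportedException();
        public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
        public char GetChar(int i) => throw new NotSupportedException();
        public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
        public Guid GetGuid(int i) => throw new NotSupportedException();
        public short GetInt16(int i) => throw new NotSupportedException();
        public int GetInt32(int i) => throw new NotSupportedException();
        public long GetInt64(int i) => _current;
        public float GetFloat(int i) => throw new NotSupportedException();
        public double GetDouble(int i) => throw new NotSupportedException();
        public string GetString(int i) => (string)GetValue(i);
        public decimal GetDecimal(int i) => throw new NotSupportedException();
        public DateTime GetDateTime(int i) => throw new NotSupportedException();
        public IDataReader GetData(int i) => throw new NotSupportedException();
    }
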
  • 2021-02-10 12:15

    You should not construct the entire DataTable in memory. Use the overload of WriteToServer that takes an array of DataRow, and just split your data into chunks, as in the sketch below.

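    A sketch of that chunked approach, assuming SQL Server via System.Data.SqlClient (Microsoft.Data.SqlClient in newer projects); the table name, column layout, and chunk size are placeholder assumptions:

    using System.Data;
    using System.Data.SqlClient;

    // Build rows in fixed-size batches and flush each batch through the
    // WriteToServer(DataRow[]) overload, so only one chunk is alive at a time.
    static void BulkInsertInChunks(string connectionString, long totalRows, int chunkSize)
    {
        DataTable buffer = new DataTable();
        buffer.Columns.Add("Id", typeof(long));       // assumed schema
        buffer.Columns.Add("Value", typeof(string));

        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable"; // assumed name
            bulkCopy.BatchSize = chunkSize;

            for (long i = 0; i < totalRows; i++)
            {
                buffer.Rows.Add(i, "value " + i);

                if (buffer.Rows.Count == chunkSize)
                {
                    bulkCopy.WriteToServer(buffer.Select()); // DataRow[] overload
                    buffer.Clear();                          // release this chunk
                }
            }
            if (buffer.Rows.Count > 0)
            {
                bulkCopy.WriteToServer(buffer.Select());     // flush the tail
            }
        }
    }
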
  • 2021-02-10 12:29

    For such a big table, you should instead use the

    public void WriteToServer(IDataReader reader)
    

    method.

    It may mean you'll have to implement a "fake" IDataReader yourself (if you don't get the data from an existing IDataReader), but this way you'll get "streaming" from end to end, and will avoid looping over 200 million rows just to build the table first.

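    Wiring that up might look like the following sketch; RowGeneratingReader is the hypothetical reader from the first answer, and the table name, batch size, and timeout are assumed values:

    using System.Data.SqlClient;

    // Hand the streaming reader straight to WriteToServer(IDataReader):
    // rows flow from the generator to the server without an in-memory table.
    static void BulkInsertStreaming(string connectionString, long totalRows)
    {
        using (var reader = new RowGeneratingReader(totalRows))
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.TargetTable"; // assumed name
            bulkCopy.BulkCopyTimeout = 0;   // 0 = no timeout for the long copy
            bulkCopy.BatchSize = 10000;     // commit in batches, not one huge transaction
            bulkCopy.WriteToServer(reader); // streaming end to end
        }
    }
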