SqlBulkCopy from a List<>


Question


How can I perform a large insert with SqlBulkCopy from a List<> of simple objects?

Should I implement my own custom IDataReader?


Answer 1:


Simply create a DataTable from your list of objects and call SqlBulkCopy.WriteToServer, passing the data table.

You might find the following useful:

  • Adding columns to a DataTable. Add a column for each property/field you wish to write.
  • Adding rows to a DataTable. Add a row for each object in your list.

For maximum performance with SqlBulkCopy, you should set an appropriate BatchSize. 10,000 seems to work well - but profile for your data.

You might also observe better results when using SqlBulkCopyOptions.TableLock.
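
As a rough sketch combining these suggestions (the Person class, the column names, and the dbo.People table are assumptions for illustration, not part of the original answer):

// Build a DataTable matching the destination schema, then bulk copy it.
var table = new DataTable();
table.Columns.Add("Id", typeof(int));      // one column per property/field you want to write
table.Columns.Add("Name", typeof(string));

foreach (var person in people)             // one row per object in your list
{
    table.Rows.Add(person.Id, person.Name);
}

using (var bcp = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
{
    bcp.DestinationTableName = "dbo.People";
    bcp.BatchSize = 10000;                 // profile to find the best value for your data
    bcp.WriteToServer(table);
}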

An interesting and informative analysis of SqlBulkCopy performance can be found here.




Answer 2:


With FastMember, you can do this without ever needing to go via DataTable (which, in my tests, more than doubles the performance):

using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader); // streams the objects directly; no DataTable needed
}

Note that ObjectReader can also work with non-generic sources, and it is not necessary to specify the member-names in advance (although you probably want to use the ColumnMappings aspect of SqlBulkCopy if you don't specify them in the ObjectReader itself).
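
For example, a minimal sketch of explicit mappings (the source and destination column names here are placeholders):

// Map source columns (or ObjectReader member names) to destination columns.
bcp.ColumnMappings.Add("Id", "CustomerId");
bcp.ColumnMappings.Add("Name", "CustomerName");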




Answer 3:


Late to the party, but if you add this EntityDataReader class from Microsoft, there's an AsDataReader() extension method that does exactly that: https://github.com/matthewschrager/Repository/blob/master/Repository.EntityFramework/EntityDataReader.cs

(Example usage of [List].AsDataReader():)

var connStr = ""; // your connection string
using (var connection = new SqlConnection(connStr))
{
    var startTime = DateTime.Now;
    connection.Open();
    var transaction = connection.BeginTransaction();
    try
    {
        using (var sbCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
        {
            sbCopy.BulkCopyTimeout = 0; // no timeout for large uploads
            sbCopy.BatchSize = 10000;
            sbCopy.DestinationTableName = "Foobars";
            // Foobars is the List<Foobar> being uploaded; AsDataReader() comes
            // from the EntityDataReader class linked above.
            var reader = Foobars.AsDataReader();
            sbCopy.WriteToServer(reader);
        }
        transaction.Commit();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
        transaction.Rollback();
    }
    finally
    {
        transaction.Dispose();
        connection.Close();
        var endTime = DateTime.Now;
        Console.WriteLine("Upload time elapsed: {0} seconds", (endTime - startTime).TotalSeconds);
    }
}



Answer 4:


Depending on what you are trying to accomplish by calling SqlBulkCopy in the first place, it might make more sense to use a Table-Valued Parameter (TVP). Using a TVP makes it trivial to send in a collection of any custom type. The data can be streamed in, so you can avoid the DataTable (much like in @Marc Gravell's answer), and you can avoid SqlBulkCopy as well.

TVPs allow complete flexibility in how to handle the data once it gets to SQL Server: you call a stored procedure and pass the TVP data into it, where it appears as a table variable that you can do anything with, not just INSERT (which is all SqlBulkCopy can do). You can also get data back via a SqlDataReader, such as newly created IDENTITY values.

I added an example and some additional notes on this answer: How can I insert 10 million records in the shortest time possible?. And several years ago I wrote an article on SQL Server Central (free registration required), Streaming Data Into SQL Server 2008 From an Application, which is also noted in that linked answer, providing a working example of passing in a generic List of a custom type, streamed in from a 3-million-row text file.
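
A hedged sketch of the streaming TVP approach follows. It assumes a user-defined table type dbo.PersonType (Id INT, Name NVARCHAR(100)) and a stored procedure dbo.ImportPeople with a @People parameter of that type; those names, and the Person class, are illustrative, not from the original answer.

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

// Yield one SqlDataRecord per object; rows stream as they are enumerated,
// so nothing is buffered in memory.
static IEnumerable<SqlDataRecord> ToRecords(IEnumerable<Person> people)
{
    var metaData = new[]
    {
        new SqlMetaData("Id", SqlDbType.Int),
        new SqlMetaData("Name", SqlDbType.NVarChar, 100)
    };
    foreach (var person in people)
    {
        var record = new SqlDataRecord(metaData);
        record.SetInt32(0, person.Id);
        record.SetString(1, person.Name);
        yield return record;
    }
}

// Pass the enumerable straight to a Structured parameter:
using (var connection = new SqlConnection(connStr))
using (var command = new SqlCommand("dbo.ImportPeople", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    var param = command.Parameters.Add("@People", SqlDbType.Structured);
    param.TypeName = "dbo.PersonType";
    param.Value = ToRecords(people);
    connection.Open();
    command.ExecuteNonQuery();
}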



Source: https://stackoverflow.com/questions/3913371/sqlbulkcopy-from-a-list
