Any way to SQLBulkCopy “insert or update if exists”?

Backend · open · 6 answers · 2039 views
青春惊慌失措 asked 2020-11-28 09:10

I need to update a very large table periodically, and SqlBulkCopy is perfect for that, except that I have a two-column unique index that prevents duplicates. Is there a way to use SqlBulkCopy as "insert or update if exists"?

6 Answers
  • 2020-11-28 09:46

    Instead of creating a new temporary table (which, by the way, consumes more space and memory), I created a trigger with INSTEAD OF INSERT and used a MERGE statement inside it.

    But don't forget to add SqlBulkCopyOptions.FireTriggers as a parameter of the SqlBulkCopy, or the trigger won't fire for the bulk-loaded rows.

    That's my two cents.
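    A minimal sketch of the client side of this approach — the connection string and table name are placeholders, and the trigger from the next answer is assumed to exist on the target table:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static class TriggerUpsert
    {
        // Bulk copy with SqlBulkCopyOptions.FireTriggers so the
        // INSTEAD OF INSERT trigger (and the MERGE inside it) runs
        // for the bulk-loaded rows. Without this option, SqlBulkCopy
        // bypasses triggers by default.
        static void BulkUpsert(DataTable rows, string connectionString)
        {
            using (var bulk = new SqlBulkCopy(connectionString,
                SqlBulkCopyOptions.FireTriggers))
            {
                bulk.DestinationTableName = "yourschma.yourtable";
                bulk.WriteToServer(rows);
            }
        }
    }
    ```
    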

  • 2020-11-28 09:53

    Got a hint from @Ivan. For those who might need it, here's what I did.

    create trigger yourschma.Tr_your_trigger_name
        on yourschma.yourtable
        instead of insert
        as
        merge into yourschma.yourtable as target
        using inserted as source
        on (target.yourtableID = source.yourtableID)
        when matched then
            update
            set target.some_column = source.some_column,
                target.Amount      = source.Amount
        when not matched by target then
            insert (some_column, Amount)
            values (source.some_column, source.Amount);
    go
    
  • 2020-11-28 09:54

    I would bulk load data into a temporary staging table, then do an upsert into the final table. See http://www.databasejournal.com/features/mssql/article.php/3739131/UPSERT-Functionality-in-SQL-Server-2008.htm for an example of doing an upsert.

  • 2020-11-28 09:56

    Not in one step, but in SQL Server 2008, you could:

    • bulk load into staging table
    • apply a MERGE statement to update/insert into your real table

    Read more about the MERGE statement
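    The two steps above can be sketched in C#. All table and column names here (dbo.Books, dbo.Staging_Books, ISBN, Title) are illustrative assumptions, not from the original question:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static class StagingUpsert
    {
        static void StageAndMerge(DataTable rows, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // Step 1: bulk load into the staging table.
                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.Staging_Books";
                    bulk.WriteToServer(rows);
                }

                // Step 2: MERGE staging rows into the real table,
                // then clear the staging table for the next batch.
                const string mergeSql = @"
                    merge dbo.Books as target
                    using dbo.Staging_Books as source
                    on (target.ISBN = source.ISBN)
                    when matched then
                        update set target.Title = source.Title
                    when not matched by target then
                        insert (ISBN, Title)
                        values (source.ISBN, source.Title);
                    truncate table dbo.Staging_Books;";

                using (var cmd = new SqlCommand(mergeSql, conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
    ```
    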

  • 2020-11-28 10:05

    Another alternative would be to skip the temporary table and instead use a stored procedure with a table-valued parameter. Pass a DataTable to the stored procedure and do the MERGE there.
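    A sketch of this approach, assuming a user-defined table type and a stored procedure like the ones in the comment below (the type, procedure, table, and column names are all hypothetical):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static class TvpUpsert
    {
        // Assumed server-side objects:
        //
        //   create type dbo.BookTableType as table
        //       (ISBN nvarchar(20), Title nvarchar(200));
        //
        //   create procedure dbo.UpsertBooks
        //       @rows dbo.BookTableType readonly
        //   as
        //       merge dbo.Books as target
        //       using @rows as source on (target.ISBN = source.ISBN)
        //       when matched then
        //           update set target.Title = source.Title
        //       when not matched by target then
        //           insert (ISBN, Title)
        //           values (source.ISBN, source.Title);
        static void UpsertViaTvp(DataTable rows, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("dbo.UpsertBooks", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;

                // Pass the DataTable as a table-valued parameter.
                var p = cmd.Parameters.AddWithValue("@rows", rows);
                p.SqlDbType = SqlDbType.Structured;
                p.TypeName = "dbo.BookTableType";

                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
    ```
    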

  • 2020-11-28 10:08

    I published a NuGet package (SqlBulkTools) to solve this problem.

    Here's a code example that achieves a bulk upsert.

    var bulk = new BulkOperations();
    var books = GetBooks();
    
    using (TransactionScope trans = new TransactionScope())
    {
        using (SqlConnection conn = new SqlConnection(ConfigurationManager
            .ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
        {
            bulk.Setup<Book>()
                .ForCollection(books)
                .WithTable("Books")
                .AddAllColumns()
                .BulkInsertOrUpdate()
                .MatchTargetOn(x => x.ISBN)
                .Commit(conn);
        }
    
        trans.Complete();
    }
    

    For very large tables, there are options to add table locks and temporarily disable non-clustered indexes. See SqlBulkTools Documentation for more examples.
