Question
I have a table with 200K+ rows that I need to update on a daily basis. My console app generates all of this data on a daily run and needs to update the table with the results.
The situation is that I'm operating over a slow network, and on each run less than 0.1% of the rows need to be updated/inserted/deleted, so clearly there's room to optimize. The table is simple: a key column plus 2 nvarchar columns.
So my question is: what's the best approach in this particular case? I could always plow it over and do a SqlBulkCopy, but would a SqlDataAdapter be more efficient?
Thx,
Mike
Answer 1:
Ship XML with all the changes to a stored procedure.
One trip to the database.
Here is an older example:
http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/
Here is a smaller example that shows the basics:
http://www.mindfiresolutions.com/Sending-Multiple-Records-As-XML-To-SQL-Server-Stored-Procedure-1861.php
Send the xml to the stored procedure. Shred the xml into a @table variable or #temp table. Do your UPDATEs / INSERTs (or a MERGE/UPSERT) using the @table variable or #temp table. A sketch of the stored-procedure side follows.
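For illustration, here is a minimal sketch of what that stored procedure might look like. It assumes the attribute-based xml shape produced by the converter later in this answer, and the object names (dbo.uspSyncEmployees, @EmployeeXml, dbo.Employee) are hypothetical, not from the original answer.

-- Hypothetical sketch: shred the xml into a @table variable, then MERGE.
CREATE PROCEDURE dbo.uspSyncEmployees
    @EmployeeXml XML
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Incoming TABLE
    (
        EmployeeKey INT PRIMARY KEY,
        LastName    NVARCHAR(100),
        FirstName   NVARCHAR(100)
    );

    -- Shred the attribute-based xml (<Employee EmployeeKey=".." ... />)
    INSERT INTO @Incoming (EmployeeKey, LastName, FirstName)
    SELECT emp.value('@EmployeeKey', 'INT'),
           emp.value('@LastName',    'NVARCHAR(100)'),
           emp.value('@FirstName',   'NVARCHAR(100)')
    FROM @EmployeeXml.nodes('/root/Employees/Employee') AS T(emp);

    -- One MERGE covers the update/insert/delete cases in a single statement.
    MERGE dbo.Employee AS target
    USING @Incoming AS source
        ON target.EmployeeKey = source.EmployeeKey
    WHEN MATCHED AND (target.LastName <> source.LastName
                      OR target.FirstName <> source.FirstName) THEN
        UPDATE SET LastName  = source.LastName,
                   FirstName = source.FirstName
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (EmployeeKey, LastName, FirstName)
        VALUES (source.EmployeeKey, source.LastName, source.FirstName)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;
END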
EDIT:
http://weblogs.asp.net/dwahlin/archive/2009/09/30/passing-multiple-records-to-a-stored-procedure-in-sql-server.aspx
Another example.
What I like to do is create a strong dataset. Put your data into the strong dataset. Then send ds.GetXml() to the stored procedure.
That way, you get strong typing (via the strong dataset), and you don't have to write your own xml-maker; you piggyback off of .GetXml(). Hint: after creating the strong dataset, remove the namespace (tempuri or something like that). A small sketch of that approach follows.
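As a hedged illustration of that older approach (EmployeesDS is a hypothetical name for a designer-generated strong dataset; the row-adder method is what the dataset designer generates, not code from the original answer):

// EmployeesDS is a hypothetical designer-generated strong (typed) dataset.
EmployeesDS ds = new EmployeesDS();
ds.Employees.AddEmployeesRow(123, "Smith", "John");
ds.Employees.AddEmployeesRow(234, "Jones", "Mary");
string xml = ds.GetXml(); // element-based xml, passed to the stored procedure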
APPEND (Spring 2019)
I no longer put items in a strong dataset (or any dataset) because of xml "bloat".
I write a custom PocoObject-To-Xml converter (that is xml-attribute based) and pass that down to the stored procedure.
Below is the element-based version, which shows the xml bloat:
<Employees>
<Employee>
<EmployeeKey>123</EmployeeKey>
<LastName>Smith</LastName>
<FirstName>John</FirstName>
</Employee>
<Employee>
<EmployeeKey>234</EmployeeKey>
<LastName>Jones</LastName>
<FirstName>Mary</FirstName>
</Employee>
</Employees>
vs. the trimmed-down attribute-based version:
<Employees>
<Employee EmployeeKey="123" LastName="Smith" FirstName="John" />
<Employee EmployeeKey="234" LastName="Jones" FirstName="Mary" />
</Employees>
And the custom converter code is below:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;

namespace MyNamespace
{
    // Minimal EmployeePoco definition (assumed; the original answer did not
    // include it) so the sample compiles.
    public class EmployeePoco
    {
        public int EmployeeKey { get; set; }
        public string LastName { get; set; }
        public string FirstName { get; set; }
    }

    public class EmployeesToXmlConverter
    {
        public string ConvertToXmlString(ICollection<EmployeePoco> emps)
        {
            StringBuilder sb = new StringBuilder();
            using (XmlWriter writer = XmlWriter.Create(sb))
            {
                writer.WriteStartElement("root");
                if (null != emps && emps.Any())
                {
                    writer.WriteStartElement("Employees");
                    foreach (EmployeePoco emp in emps)
                    {
                        writer.WriteStartElement("Employee");
                        writer.WriteAttributeString("EmployeeKey", Convert.ToString(emp.EmployeeKey));
                        writer.WriteAttributeString("LastName", emp.LastName);
                        writer.WriteAttributeString("FirstName", emp.FirstName);
                        writer.WriteEndElement(); // closing Employee element
                    }
                    writer.WriteEndElement(); // closing Employees element
                }
                writer.WriteEndElement(); // closing root element
            }
            return sb.ToString();
        }
    }
}
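For completeness, a hedged sketch of how the converter output might be handed to the stored procedure from the console app. The procedure and parameter names match the hypothetical T-SQL sketch earlier in this answer, and connectionString is a placeholder:

// Requires: using System.Collections.Generic;
//           using System.Data;
//           using System.Data.SqlClient;
List<EmployeePoco> emps = new List<EmployeePoco>
{
    new EmployeePoco { EmployeeKey = 123, LastName = "Smith", FirstName = "John" },
    new EmployeePoco { EmployeeKey = 234, LastName = "Jones", FirstName = "Mary" }
};

string xml = new EmployeesToXmlConverter().ConvertToXmlString(emps);

using (SqlConnection conn = new SqlConnection(connectionString)) // placeholder
using (SqlCommand cmd = new SqlCommand("dbo.uspSyncEmployees", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@EmployeeXml", SqlDbType.Xml).Value = xml;
    conn.Open();
    cmd.ExecuteNonQuery(); // one round trip carries the whole day's changes
}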
Source: https://stackoverflow.com/questions/17176898/c-sharp-best-approach-for-a-bulk-update-merge-on-a-table-with-high-data-volumn