I have a list of about 4 million objects. There is a stored proc that takes the objects' attributes as params, does some lookups, and inserts them into tables.
It's never going to be ideal to insert four million records from C#, but a better way than one round trip per row is to build the command text up in code so you can do it in chunks.
This is hardly bulletproof, and it doesn't illustrate how to incorporate lookups (as you've mentioned you need), but the basic idea is:
// You'd modify this to chunk it out - only testing can tell you the right
// number - perhaps 100 at a time.
for (int i = 0; i < items.Count; i++) {
    // e.g., 'insert dbo.Customer values(@firstName1, @lastName1);'
    // Note the trailing semicolon so the statements don't run together.
    string newStatement = string.Format(
        "insert dbo.Customer values(@firstName{0}, @lastName{0});", i);
    command.CommandText += newStatement;
    command.Parameters.AddWithValue("@firstName" + i, items[i].FirstName);
    command.Parameters.AddWithValue("@lastName" + i, items[i].LastName);
}
// ...
command.ExecuteNonQuery();
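For what it's worth, here's a rough sketch of what the chunked version might look like. Everything in it is an assumption on my part - a list called items with FirstName/LastName properties, an open SqlConnection called connection, and a batch size of 100 - so adjust to your schema:

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

const int batchSize = 100; // hypothetical - tune by testing

for (int start = 0; start < items.Count; start += batchSize) {
    using (SqlCommand command = connection.CreateCommand()) {
        List<string> statements = new List<string>();
        int end = Math.Min(start + batchSize, items.Count);

        for (int i = start; i < end; i++) {
            statements.Add(string.Format(
                "insert dbo.Customer values(@firstName{0}, @lastName{0})", i));
            command.Parameters.AddWithValue("@firstName" + i, items[i].FirstName);
            command.Parameters.AddWithValue("@lastName" + i, items[i].LastName);
        }

        // One round trip per batch instead of one per row.
        command.CommandText = string.Join(";", statements.ToArray());
        command.ExecuteNonQuery();
    }
}

One thing to keep in mind: SQL Server caps a single command at 2100 parameters, so the batch size times the parameters per row has to stay under that.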
I have had excellent results using XML to get large amounts of data into SQL Server. Like you, I was initially inserting rows one at a time, which took forever due to the round-trip time between the application and the server; then I switched the logic to pass in an XML string containing all the rows to insert. Time to insert went from 30 minutes to less than 5 seconds. That was for a couple of thousand rows. I have tested with XML strings up to 20 megabytes in size without issues. Depending on your row size this might be an option.
The data was passed in as an XML string using the ntext type.
Something like this formed the basic details of the stored procedure that did the work:
CREATE PROCEDURE XMLInsertPr( @XmlString ntext )
AS
BEGIN
    DECLARE @ReturnStatus int, @hdoc int

    EXEC @ReturnStatus = sp_xml_preparedocument @hdoc OUTPUT, @XmlString
    IF (@ReturnStatus <> 0)
    BEGIN
        RAISERROR ('Unable to open XML document', 16, 1, 50003)
        RETURN @ReturnStatus
    END

    INSERT INTO TableName
    SELECT * FROM OPENXML(@hdoc, '/XMLData/Data') WITH TableName

    -- Release the memory held by the parsed document.
    EXEC sp_xml_removedocument @hdoc
END
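On the client side, the call might look something like this sketch (assuming a proc named XMLInsertPr as above and objects with FirstName/LastName properties; OPENXML's default mapping is attribute-centric, so each row becomes an element with one attribute per column):

using System.Data;
using System.Data.SqlClient;
using System.Security;
using System.Text;

StringBuilder xml = new StringBuilder("<XMLData>");
foreach (var item in items) {
    // Escape the values so quotes/angle brackets don't break the XML.
    xml.AppendFormat("<Data FirstName=\"{0}\" LastName=\"{1}\" />",
        SecurityElement.Escape(item.FirstName),
        SecurityElement.Escape(item.LastName));
}
xml.Append("</XMLData>");

using (SqlCommand command = new SqlCommand("XMLInsertPr", connection)) {
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.Add("@XmlString", SqlDbType.NText).Value = xml.ToString();
    command.ExecuteNonQuery();
}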
Take a look at the SqlBulkCopy class.
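At its simplest it looks something like this (a sketch only - dbo.Customer, the column names, and the open connection are all assumptions):

using System.Data;
using System.Data.SqlClient;

// Build a DataTable whose columns line up with the destination table.
DataTable table = new DataTable();
table.Columns.Add("FirstName", typeof(string));
table.Columns.Add("LastName", typeof(string));
foreach (var item in items) {
    table.Rows.Add(item.FirstName, item.LastName);
}

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection)) {
    bulkCopy.DestinationTableName = "dbo.Customer";
    bulkCopy.BatchSize = 10000; // tune by testing
    bulkCopy.WriteToServer(table);
}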
Based on your comment: dump the data into a staging table, then do the lookup and the insert into the real table set-based from a proc. It will be much faster than row by row.
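The set-based step might look roughly like this (dbo.CustomerStaging and dbo.StatusLookup are hypothetical names standing in for your staging table and lookup table):

using System.Data.SqlClient;

// One statement does the lookup and the insert for every staged row at once.
const string setBasedInsert = @"
    INSERT INTO dbo.Customer (FirstName, LastName, StatusId)
    SELECT s.FirstName, s.LastName, l.StatusId
    FROM dbo.CustomerStaging AS s
    JOIN dbo.StatusLookup AS l ON l.StatusCode = s.StatusCode;";

using (SqlCommand command = new SqlCommand(setBasedInsert, connection)) {
    command.ExecuteNonQuery();
}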
If this is a one-time operation: don't optimize it, just run it during the night/weekend.
You might consider dropping any indexes you have on the table(s) you are inserting into and then recreating them after you have inserted everything. I'm not sure how the bulk copy class works, but if you are updating your indexes on every insert, it can slow things down quite a bit.
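If you go that route, ALTER INDEX ... DISABLE / REBUILD saves you from scripting out the full index definitions (sketch only - IX_Customer_LastName is a made-up index name, and disabling a clustered index makes the whole table unreadable, so only do this to non-clustered ones):

using System.Data.SqlClient;

// Disable the non-clustered index before the load...
using (SqlCommand disable = new SqlCommand(
    "ALTER INDEX IX_Customer_LastName ON dbo.Customer DISABLE;", connection)) {
    disable.ExecuteNonQuery();
}

// ... perform the bulk insert here ...

// ...then rebuild it once, instead of updating it on every insert.
using (SqlCommand rebuild = new SqlCommand(
    "ALTER INDEX IX_Customer_LastName ON dbo.Customer REBUILD;", connection)) {
    rebuild.ExecuteNonQuery();
}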