Question
I am uploading data from a CSV file to a SQL table using the following function.
Is there a better way to do it?
My main concern is that the connection is currently held open for a long time, and I would like to reduce that.
public bool SaveProxyBulkUploadData(List<AddServerPError> saveBulkUploadData)
{
    try
    {
        foreach (AddServerPError addServerData in saveBulkUploadData)
        {
            DbCommand dbCmd = CitiScriptExecutionDB.GetStoredProcCommand("USP_HS_InsertProxyBulkUploadData");
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@groupid", DbType.String, addServerData.GroupId);
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@proxyname", DbType.String, addServerData.ProxyName);
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@proxytype", DbType.String, addServerData.ProxyType);
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@sa_spoc_dl", DbType.String, addServerData.SA_SPOC_DL);
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@assignmentgroup", DbType.String, addServerData.AssignmentGroup);
            CitiScriptExecutionDB.AddInParameter(dbCmd, "@createdby", DbType.String, addServerData.CreatedBy);
            CitiScriptExecutionDB.ExecuteNonQuery(dbCmd);
        }
        return true;
    }
    catch (Exception ex)
    {
        string msg = ex.Message + ex.StackTrace;
        throw;
    }
}
Stored Procedure
CREATE PROCEDURE USP_HS_InsertBulkUploadData
    @groupid int,
    @hostname varchar(50),
    @type1 varchar(50),
    @type2 varchar(50),
    @createdby varchar(50)
AS
BEGIN
    INSERT INTO [dbo].[EUCUsecaseGroupInputParam]
    (
        GroupId,
        HostName,
        Type1,
        Type2,
        CreatedBy,
        CreatedDate,
        UpdatedBy,
        UpdatedDate,
        IsActive
    )
    VALUES
    (
        @groupid,
        @hostname,
        @type1,
        @type2,
        @createdby,
        GETDATE(),
        NULL,
        NULL,
        1
    )
END
Answer 1:
Your best bet here is probably SqlBulkCopy, which throws raw TDS at the server very efficiently. SqlBulkCopy takes two types of input:
- DataTable
- IDataReader
So at that point you have 3 options:
- transform your List<AddServerPError> to a DataTable manually
- use something like FastMember to obtain an IDataReader over your existing list (an example of this is at the bottom of the project page: https://github.com/mgravell/fast-member); on NuGet: https://www.nuget.org/packages/FastMember/
- ditch your existing list, and read the CSV directly as an IDataReader; there is a "LumenWorks" CsvReader that has always worked well for that; the closest NuGet link I can see is https://www.nuget.org/packages/LumenWorksCsvReader/ (although this isn't the original, it has been extended)
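For the first option, a minimal sketch of the manual List-to-DataTable conversion might look like the following. The AddServerPError shape and the column names are assumptions taken from the question's code; adjust them to match the real destination table:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Assumed shape of the DTO, inferred from the parameters in the question.
public class AddServerPError
{
    public string GroupId { get; set; }
    public string ProxyName { get; set; }
    public string ProxyType { get; set; }
    public string SA_SPOC_DL { get; set; }
    public string AssignmentGroup { get; set; }
    public string CreatedBy { get; set; }
}

public static class BulkUploadHelper
{
    // Build a DataTable whose columns mirror the destination table,
    // so SqlBulkCopy can map them by name.
    public static DataTable ToDataTable(List<AddServerPError> rows)
    {
        var table = new DataTable();
        table.Columns.Add("GroupId", typeof(string));
        table.Columns.Add("ProxyName", typeof(string));
        table.Columns.Add("ProxyType", typeof(string));
        table.Columns.Add("SA_SPOC_DL", typeof(string));
        table.Columns.Add("AssignmentGroup", typeof(string));
        table.Columns.Add("CreatedBy", typeof(string));

        foreach (var r in rows)
            table.Rows.Add(r.GroupId, r.ProxyName, r.ProxyType,
                           r.SA_SPOC_DL, r.AssignmentGroup, r.CreatedBy);
        return table;
    }
}

// The table can then be pushed to the server in one streamed operation.
// This part needs a reachable SQL Server, so it is shown as a comment:
//
// using (var bcp = new SqlBulkCopy(connectionString))
// {
//     bcp.DestinationTableName = "dbo.YourDestinationTable"; // hypothetical name
//     bcp.WriteToServer(BulkUploadHelper.ToDataTable(list));
// }
```

The key point is that the connection is opened once and the rows are streamed, instead of one stored-procedure round trip per row.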
Answer 2:
You can consider using Cinchoo ETL, an open-source library, to bulk upload a CSV file to the database.
Option 1:
Load the CSV file straight into the database
// ChoCSVReader comes from the Cinchoo ETL (ChoETL) NuGet package.
string connectionstring = @"#YOUR DB ConnectionString#";
using (SqlBulkCopy bcp = new SqlBulkCopy(connectionstring))
{
    using (var p = new ChoCSVReader("#YOUR CSV FILE#"))
    {
        bcp.DestinationTableName = "#TABLENAME#";
        bcp.EnableStreaming = true;
        bcp.BatchSize = 10000;
        bcp.BulkCopyTimeout = 0;
        bcp.NotifyAfter = 100;
        bcp.SqlRowsCopied += delegate (object sender, SqlRowsCopiedEventArgs e)
        {
            Console.WriteLine(e.RowsCopied.ToString("#,##0") + " rows copied.");
        };
        bcp.WriteToServer(p.AsDataReader());
    }
}
Option 2:
If the CSV has already been loaded and output as a List<AddServerPError>, you can still upload it to the database as below
// AsDataReader() on a List is an extension method from the ChoETL package.
List<AddServerPError> objs = # Your input objects #;
string connectionstring = @"#YOUR DB ConnectionString#";
using (SqlBulkCopy bcp = new SqlBulkCopy(connectionstring))
{
    bcp.DestinationTableName = "#TABLENAME#";
    bcp.EnableStreaming = true;
    bcp.BatchSize = 10000;
    bcp.BulkCopyTimeout = 0;
    bcp.NotifyAfter = 100;
    bcp.SqlRowsCopied += delegate (object sender, SqlRowsCopiedEventArgs e)
    {
        Console.WriteLine(e.RowsCopied.ToString("#,##0") + " rows copied.");
    };
    bcp.WriteToServer(objs.AsDataReader());
}
Source: https://stackoverflow.com/questions/51417547/bulk-upload-strategy-for-sql-server