SqlBulkCopy Not Working

予麋鹿 2021-02-18 17:54

I have a DataSet populated from an Excel sheet. I want to use SqlBulkCopy to insert records into the Lead_Hdr table, where LeadId is the PK.

7 Answers
  • 2021-02-18 18:23

    What I have found is that the columns in the destination table and the columns in the input must at least match. The table can have more columns than the input and the load will still succeed; if it has fewer, you'll get the error.
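    As a rough sketch of that point (the table and method names here are illustrative, not from the question): mapping only the input's columns explicitly lets the destination table carry extra columns, which simply fall back to their defaults.

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    // Hypothetical helper: the input DataTable may have fewer columns than
    // the destination table; explicit mappings make the intent clear and
    // surface name mismatches early instead of failing on ordinal position.
    static void CopyWithExplicitMappings(string connectionString, DataTable input)
    {
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.DestinationTableName = "dbo.Lead_Hdr";
            foreach (DataColumn column in input.Columns)
            {
                // Map each input column to the destination column of the same name
                bulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
            }
            bulkCopy.WriteToServer(input);
        }
    }
    ```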

  • 2021-02-18 18:25

    One reason is that SqlBulkCopy's column mappings are case sensitive. Follow these steps:

    1. First, look up each destination column in the source table using the `Contains` method in C#.
    2. Once a destination column is matched to a source column, get that column's index and pass the source column's name to SqlBulkCopy.

    For example:

        // Get the columns from the source table
        string sourceTableQuery = "SELECT TOP 1 * FROM sourceTable";
        // SqlHelper here is a helper class for executing queries; plain ADO.NET works too
        DataTable dtSource = SQLHelper.SqlHelper.ExecuteDataset(transaction, CommandType.Text, sourceTableQuery).Tables[0];

        for (int i = 0; i < destinationTable.Columns.Count; i++)
        {
            // Check whether the destination column exists in the source table
            // (Contains is not case sensitive)
            if (dtSource.Columns.Contains(destinationTable.Columns[i].ToString()))
            {
                // Once a column is matched, get its index in the source table
                int sourceColumnIndex = dtSource.Columns.IndexOf(destinationTable.Columns[i].ToString());

                // Use the source table's column name on both sides of the mapping
                // so the mapping never trips over case-sensitivity mismatches
                bulkCopy.ColumnMappings.Add(dtSource.Columns[sourceColumnIndex].ToString(),
                                            dtSource.Columns[sourceColumnIndex].ToString());
            }
        }
        bulkCopy.WriteToServer(destinationTable);
        bulkCopy.Close();
    
  • I've encountered the same problem while copying data from Access to SQL Server 2005, and I found that the column mappings are case sensitive on both data sources, regardless of the databases' collation sensitivity.

  • 2021-02-18 18:32

    I thought a long time about answering... Even if the column names match exactly, including case, you'll get the same error if the data types differ. So check both the column names and their data types.

    P.S.: staging tables are definitely the way to import.

  • 2021-02-18 18:36

    Well, is it right? Do the column names exist on both sides?

    To be honest, I've never bothered with mappings. I like to keep things simple - I tend to have a staging table on the server that looks like the input, then I SqlBulkCopy into the staging table, and finally run a stored procedure to move the data from the staging table into the actual table; advantages:

    • no issues with live data corruption if the import fails at any point
    • I can put a transaction just around the SPROC
    • I can have the bcp work without logging, safe in the knowledge that the SPROC will be logged
    • it is simple ;-p (no messing with mappings)

    As a final thought - if you are dealing with bulk data, you can get better throughput using an IDataReader (since this is a streaming API, whereas DataTable is a buffered API). For example, I tend to hook CSV imports up using CsvReader as the source for a SqlBulkCopy. Alternatively, I have written shims around XmlReader to present each first-level element as a row in an IDataReader - very fast.
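    The staging flow described above can be sketched roughly like this (the staging table `Lead_Hdr_Staging` and the stored procedure `usp_MergeLeadHdr` are hypothetical names, not from the question):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    // Sketch of the staging-table pattern, assuming a staging table whose
    // columns match the input DataTable and a stored procedure that merges
    // staged rows into the real table.
    static void ImportViaStaging(string connectionString, DataTable input)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // 1. Bulk copy into the staging table: if this fails midway,
            //    the live table is untouched.
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Lead_Hdr_Staging";
                bulkCopy.WriteToServer(input);
            }

            // 2. Move staged rows into the real table, with a transaction
            //    wrapped just around the stored procedure call.
            using (var tx = connection.BeginTransaction())
            using (var cmd = new SqlCommand("dbo.usp_MergeLeadHdr", connection, tx))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.ExecuteNonQuery();
                tx.Commit();
            }
        }
    }
    ```

    The bulk copy itself can then run minimally logged, while the merge step stays fully logged and transactional.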

  • 2021-02-18 18:38

    I would go with the staging idea; however, here is my approach to handling the case-sensitive nature. Happy to be critiqued on my LINQ.

    using (SqlConnection connection = new SqlConnection(conn_str))
    {
        connection.Open();
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = string.Format("[{0}].[{1}].[{2}]", targetDatabase, targetSchema, targetTable);
            var targetColumnsAvailable = GetSchema(conn_str, targetTable).ToArray();
            foreach (var column in dt.Columns)
            {
                // Match destination columns case-insensitively
                if (targetColumnsAvailable.Any(x => String.Equals(x, column.ToString(), StringComparison.OrdinalIgnoreCase)))
                {
                    var tc = targetColumnsAvailable.Single(x => String.Equals(x, column.ToString(), StringComparison.OrdinalIgnoreCase));
                    bulkCopy.ColumnMappings.Add(column.ToString(), tc);
                }
            }

            // Write from the source to the destination.
            bulkCopy.WriteToServer(dt);
            bulkCopy.Close();
        }
    }

    and the helper method

    private static IEnumerable<string> GetSchema(string connectionString, string tableName)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = connection.CreateCommand())
        {
            command.CommandText = "sp_columns";
            command.CommandType = CommandType.StoredProcedure;

            command.Parameters.Add("@table_name", SqlDbType.NVarChar, 384).Value = tableName;

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    yield return (string)reader["column_name"];
                }
            }
        }
    }