Bulk data insertion into a SQL Server table from a delimited text file using C#

Submitted by 半世苍凉 on 2019-12-23 13:06:41

Question


I have a tab-delimited text file of around 100 MB. I want to store the data from this file in a SQL Server table; it comes to about 1 million records once loaded. What is the best way to achieve this?

I could create an in-memory DataTable in C# and then upload it to SQL Server, but that would load the entire 100 MB file into memory. What if the file gets bigger?


Answer 1:


No problem; CsvReader will handle most delimited text formats and implements IDataReader, so it can be used to feed a SqlBulkCopy. For example:

using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = first row is headers
using (var bcp = new SqlBulkCopy(connectionString))
{
    bcp.DestinationTableName = "Foo";
    bcp.WriteToServer(csv);
}

Note that CsvReader has lots of options for more subtle file handling (specifying the delimiter rules, etc.). SqlBulkCopy is the high-performance bulk-load API and is very efficient. This is a streaming reader/writer API; it does not load all the data into memory at once.
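As an aside, SqlBulkCopy also exposes a few tuning knobs worth knowing about for a load this size. A minimal sketch, assuming a helper method of my own devising (the connection string, table name, batch size, and notification interval here are all placeholders, not part of the answer above):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class BulkLoader
{
    // Hypothetical helper: streams any IDataReader into a destination table.
    static void BulkLoad(IDataReader source, string connectionString, string table)
    {
        using (var bcp = new SqlBulkCopy(connectionString))
        {
            bcp.DestinationTableName = table;
            bcp.BatchSize = 10000;    // commit every 10k rows instead of one huge batch
            bcp.BulkCopyTimeout = 0;  // disable the 30s default timeout for large loads
            bcp.NotifyAfter = 100000; // raise SqlRowsCopied every 100k rows
            bcp.SqlRowsCopied += (s, e) =>
                Console.WriteLine("{0} rows copied", e.RowsCopied);
            bcp.WriteToServer(source); // streams; never buffers the whole file
        }
    }
}
```

The IDataReader passed in would be the CsvReader from the snippet above; because both ends stream, memory use stays flat regardless of file size.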




Answer 2:


You should read the file line by line, so you don't have to load the whole file into memory:

using (var file = System.IO.File.OpenText(filename))
{
    while (!file.EndOfStream)
    {
        string line = file.ReadLine();

        // TODO: Do your INSERT here
    }
}

* Update *

"This will make 1 million separate insert commands to sql server. Is there any way to make it in bulk"

You could use parameterised queries; that would still issue 1M individual inserts, but they would be quite fast.
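The parameterised approach could be sketched like this (a minimal sketch, assuming a hypothetical destination table Foo(A, B) and a tab-delimited input file; the column sizes and types should match your actual schema):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

static class RowByRowLoader
{
    static void InsertRowByRow(string path, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Foo (A, B) VALUES (@a, @b)", conn))
        {
            // Create the parameters once and reuse them for every row,
            // so SQL Server can cache a single query plan.
            var pA = cmd.Parameters.Add("@a", SqlDbType.NVarChar, 100);
            var pB = cmd.Parameters.Add("@b", SqlDbType.NVarChar, 100);
            conn.Open();

            using (var file = File.OpenText(path))
            {
                while (!file.EndOfStream)
                {
                    var parts = file.ReadLine().Split('\t');
                    pA.Value = parts[0];
                    pB.Value = parts[1];
                    cmd.ExecuteNonQuery(); // still one round-trip per row
                }
            }
        }
    }
}
```

Reusing the parameters avoids re-parsing the statement for each row, but it is still one network round-trip per insert, which is why the bulk-copy options below are faster.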

Alternatively, you can use SqlBulkCopy, but that is going to be rather difficult if you don't want to use 3rd-party libraries. If you are amenable to the MS licence, you could use the LINQ Entity Data Reader (distributed under the Ms-PL licence), which provides the AsDataReader extension method:

void MyInsertMethod()
{
    using (var bulk = new SqlBulkCopy("MyConnectionString"))
    {
        bulk.DestinationTableName = "MyTableName";
        bulk.WriteToServer(GetRows().AsDataReader());
    }
}

class MyType
{
    public string A { get; set; }
    public string B { get; set; }
}

IEnumerable<MyType> GetRows()
{
    using (var file = System.IO.File.OpenText("MyTextFile"))
    {
        while (!file.EndOfStream)
        {
            var splitLine = file.ReadLine().Split(',');

            yield return new MyType() { A = splitLine[0], B = splitLine[1] };
        }
    }
}

If you didn't want to use the MS-licensed code either, you could implement IDataReader yourself, but that is going to be a PITA. Note that the CSV handling above (Split(',')) is not at all robust, and also that the column names in the table must match the property names on MyType. TBH, I'd recommend you go with Marc's answer on this one.
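On the column-name caveat: SqlBulkCopy can map source fields to differently named destination columns via ColumnMappings, so the property names on MyType need not match the table. A hedged sketch (the destination column names "ColumnA"/"ColumnB" are hypothetical):

```csharp
using System.Data.SqlClient;

static class MappedBulkLoad
{
    static void Load(System.Data.IDataReader rows)
    {
        using (var bulk = new SqlBulkCopy("MyConnectionString"))
        {
            bulk.DestinationTableName = "MyTableName";
            // Map the reader's fields (named after MyType's properties)
            // to the actual destination column names.
            bulk.ColumnMappings.Add("A", "ColumnA");
            bulk.ColumnMappings.Add("B", "ColumnB");
            bulk.WriteToServer(rows);
        }
    }
}
```

Note that once any mapping is added, every column you want copied must be mapped explicitly; unmapped columns are no longer matched by ordinal.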



Source: https://stackoverflow.com/questions/11840522/bulk-data-insertion-in-sql-server-table-from-delimited-text-file-using-c-sharp
