Table Valued Parameter: sending data in small chunks

时光说笑 2021-01-26 17:09

I am reading from a CSV file and sending the data as a table variable to a stored procedure. From what I have tested so far, I am able to process 300k records in 3 minutes 30 seconds.

1 Answer
  •  醉话见心
    2021-01-26 17:34

    An example of using IEnumerable&lt;SqlDataRecord&gt;.
    It works somewhat like a reverse DataReader: you stream rows out to SQL Server instead of reading them in.

    Notice that I sort, and the sort order matches the clustered index. Index fragmentation will absolutely kill load speed. My first implementation used INSERT VALUES with unsorted data, and over a 12-hour run this version is literally 100x faster. I also disable all indexes other than the PK and rebuild them at the end of the load. Over a long run I get about 500 rows/second. Your sample works out to about 1,400 rows/second (300k in 3.5 minutes), so that is great. If you start to see degradation, those are the things to look at.

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using Microsoft.SqlServer.Server;   // SqlDataRecord, SqlMetaData

    // DocFTSinX is the answerer's row type (Word1, Word2, Delta).
    public class DocFTSinXsCollection : List<DocFTSinX>, IEnumerable<SqlDataRecord>
    {
        // used by the TVP for fast insert
        private int sID;
        private IEnumerable<DocFTSinX> docFTSinXs;

        // Streams one reusable SqlDataRecord per row, sorted to match the clustered index.
        IEnumerator<SqlDataRecord> IEnumerable<SqlDataRecord>.GetEnumerator()
        {
            //todo fix the order in 3 to sID, wordID1, wordID2
            var sdr = new SqlDataRecord(
                new SqlMetaData("wordID1", System.Data.SqlDbType.Int),
                new SqlMetaData("wordID2", System.Data.SqlDbType.Int),
                new SqlMetaData("sID", System.Data.SqlDbType.Int),
                new SqlMetaData("Delta", System.Data.SqlDbType.Int));
            foreach (DocFTSinX oh in docFTSinXs.OrderBy(x => x.Word1).ThenBy(x => x.Word2))
            {
                sdr.SetInt32(0, oh.Word1);
                sdr.SetInt32(1, oh.Word2);
                sdr.SetInt32(2, sID);
                sdr.SetInt32(3, (Int32)oh.Delta);
                yield return sdr;
            }
        }

        public DocFTSinXsCollection(int SID, IEnumerable<DocFTSinX> DocFTSinXs)
        {
            sID = SID;
            docFTSinXs = DocFTSinXs;
            //Debug.WriteLine("DocFTSinXsCollection DocFTSinXs " + DocFTSinXs.Count().ToString());
        }
    }
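
    For completeness, here is a minimal sketch of sending such a collection to a stored procedure as a TVP. The table type dbo.DocFTSinXType, the procedure name dbo.LoadDocFTSinX, and the @rows parameter are illustrative assumptions, not part of the original answer.

    using System.Data;
    using System.Data.SqlClient;

    public static void SendBatch(DocFTSinXsCollection rows, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.LoadDocFTSinX", conn))   // assumed proc name
        {
            cmd.CommandType = CommandType.StoredProcedure;
            // SqlDbType.Structured tells ADO.NET to stream the IEnumerable<SqlDataRecord> as a TVP
            var p = cmd.Parameters.AddWithValue("@rows", rows);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.DocFTSinXType";                         // assumed table type name
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }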
    

    Other tools to consider are the SqlBulkCopy .NET class and Dapper.
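
    As a rough sketch of the SqlBulkCopy route (the destination table name and batch size below are assumptions for illustration; SqlBulkCopy itself is the standard ADO.NET class):

    using System.Data;
    using System.Data.SqlClient;

    public static void BulkLoad(DataTable batch, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.DocFTSinX";   // assumed target table
            bulk.BatchSize = 100000;                       // rows sent per round trip
            bulk.WriteToServer(batch);                     // also accepts an IDataReader
        }
    }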

    The OP asked how to send the data in batches. The skeleton is below; a fuller sketch follows it.

     while (true)
     {
         // if there are no more rows, break
         // fill a list or DataTable with the next 100,000 rows
         // send the list or DataTable to the db as the TVP
     }
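
    A fuller sketch of that loop, assuming the SendBatch helper above, that DocFTSinX exposes settable Word1, Word2 and Delta properties, and that the CSV contains exactly those three integer columns; the file layout and the 100,000-row batch size are illustrative only.

    using System.Collections.Generic;
    using System.IO;

    public static class CsvTvpLoader
    {
        public static void LoadInBatches(string csvPath, string connectionString, int sID)
        {
            const int batchSize = 100000;
            using (var reader = new StreamReader(csvPath))
            {
                while (true)
                {
                    var batch = new List<DocFTSinX>(batchSize);
                    string line;
                    while (batch.Count < batchSize && (line = reader.ReadLine()) != null)
                    {
                        var f = line.Split(',');
                        batch.Add(new DocFTSinX
                        {
                            Word1 = int.Parse(f[0]),
                            Word2 = int.Parse(f[1]),
                            Delta = int.Parse(f[2])
                        });
                    }
                    if (batch.Count == 0) break;   // nothing left to read
                    // wrap the batch and stream it to the stored procedure as a TVP
                    SendBatch(new DocFTSinXsCollection(sID, batch), connectionString);
                }
            }
        }
    }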
    
