Any best practices for inserting table records with a SQL CLR stored procedure?

眉间皱痕 posted on 2019-12-03 22:47:32

I ran across this while working on an SQLite project a few months back and found it enlightening. I think it might be what you're looking for.

...

Fastest universal way to insert data using standard ADO.NET constructs

Now that the slow stuff is out of the way, let's talk about some hardcore bulk loading. Aside from SqlBulkCopy and specialized constructs involving ISAM or custom bulk insert classes from other providers, there is simply no beating the raw power of ExecuteNonQuery() on a parameterized INSERT statement. I will demonstrate:

internal static void FastInsertMany(DbConnection cnn)
{
    using (DbTransaction dbTrans = cnn.BeginTransaction())
    {
        using (DbCommand cmd = cnn.CreateCommand())
        {
            // Enlist the command in the transaction; some providers
            // (SQL Server among them) require this explicitly.
            cmd.Transaction = dbTrans;

            // Prepare the parameterized INSERT once...
            cmd.CommandText = "INSERT INTO TestCase(MyValue) VALUES(?)";
            DbParameter Field1 = cmd.CreateParameter();
            cmd.Parameters.Add(Field1);

            // ...then reuse it for every row, changing only the value.
            for (int n = 0; n < 100000; n++)
            {
                Field1.Value = n + 100000;
                cmd.ExecuteNonQuery();
            }
        }

        // One commit around all the inserts is what makes this fast.
        dbTrans.Commit();
    }
}
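For completeness, SqlBulkCopy (the exception noted above) looks roughly like this against SQL Server. This is a sketch only; the connection string, the `TestCase` table, and the `MyValue` column mirror the example above and are otherwise hypothetical.

```csharp
using System.Data;
using Microsoft.Data.SqlClient; // System.Data.SqlClient on .NET Framework

internal static void BulkLoad(string connectionString)
{
    // Build the rows in memory first; a DataTable (or any IDataReader) works.
    var table = new DataTable();
    table.Columns.Add("MyValue", typeof(int));
    for (int n = 0; n < 100000; n++)
        table.Rows.Add(n + 100000);

    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "TestCase";   // hypothetical target table
        bulk.ColumnMappings.Add("MyValue", "MyValue");
        bulk.BatchSize = 10000;                   // rows per round trip
        bulk.WriteToServer(table);                // streams via the bulk API
    }
}
```

Because it uses the server's bulk-load path rather than per-row statements, this typically beats even the transaction-wrapped loop above when the target is SQL Server.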

You could return a table with 2 columns (COLLECTION_NAME nvarchar(max), CONTENT xml), with one row per internal collection you have. CONTENT would be an XML representation of the data in that collection.

Then you can use the XML features of SQL Server 2005/2008 to shred each collection's XML into rows, and perform your INSERT INTO or MERGE statements on the whole set at once.
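A minimal sketch of that shredding step, driven from C#: serialize the collection to XML, then let the server turn it into rows with `nodes()`/`value()` in a single statement. The `TestCase` table and `MyValue` column follow the earlier example; the XML shape (`<items><i v="..."/></items>`) and the `@content` parameter name are assumptions for illustration.

```csharp
using System.Data;
using System.Linq;
using System.Xml.Linq;
using Microsoft.Data.SqlClient;

internal static void InsertCollectionAsXml(SqlConnection cnn, int[] values)
{
    // Build <items><i v="100000"/>...</items> from the in-memory collection.
    var xml = new XElement("items",
        values.Select(v => new XElement("i", new XAttribute("v", v))));

    using (var cmd = cnn.CreateCommand())
    {
        // One round trip: the server shreds the XML into rows and
        // inserts the whole set in a single statement.
        cmd.CommandText =
            @"INSERT INTO TestCase (MyValue)
              SELECT x.i.value('@v', 'int')
              FROM @content.nodes('/items/i') AS x(i);";
        cmd.Parameters.Add("@content", SqlDbType.Xml).Value = xml.ToString();
        cmd.ExecuteNonQuery();
    }
}
```

Inside a SQL CLR procedure you would use the context connection ("context connection=true") instead of an external one, but the shape of the statement is the same.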

That should be faster than issuing individual INSERTs from your C# code.
