SqlBulkCopy

How to use SQL Bulk Copy with Dapper .NET?

I am working with Dapper .NET for bulk insert operations into SQL tables. I am thinking of using SqlBulkCopy with Dapper .NET, but I don't have any experience with it. How do I use SqlBulkCopy with Dapper .NET? Your help is highly appreciated.

Alex Erygin: It is not a good idea to use Dapper for bulk inserts, because it will not be fast. The better option for this is the SqlBulkCopy class. But if you want to use Dapper for bulk inserts, you can find a solution here.

Source: https://stackoverflow.com/questions/29070108/how-to-use-sql-bulk-copy-with-dapper-net
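
For reference, a minimal sketch of the SqlBulkCopy approach the answer recommends. The table name, columns, and connection string are placeholders, not anything from the question:

    using System.Data;
    using System.Data.SqlClient;

    // Assumed destination table: dbo.People(Name nvarchar, Age int).
    var table = new DataTable();
    table.Columns.Add("Name", typeof(string));
    table.Columns.Add("Age", typeof(int));
    table.Rows.Add("Alice", 30);

    using (var bulk = new SqlBulkCopy("<connection-string>"))
    {
        bulk.DestinationTableName = "dbo.People";
        bulk.WriteToServer(table); // one round trip for all rows
    }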

Is there a faster way to use SqlBulkCopy than using a DataTable?

I load a large number of records into my application (1 million+) and do a ton of processing on them. The processing requires them all to be in memory. Afterwards, I want to dump all of the (now modified) records into an empty table. Loading the records takes mere seconds, and I end up with a large array of MyRecord items. Saving with SqlBulkCopy takes mere seconds as well. However, SqlBulkCopy requires (I believe) a DataTable, and loading my records into a DataTable is slow: approximately 7,500 records per minute using dataTable.Rows.Add(myRecord.Name, myRecord.Age, ....). Is there a …
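
A commonly suggested answer to this one: skip the DataTable entirely and hand SqlBulkCopy an IDataReader over the in-memory objects. A hedged sketch using the FastMember NuGet package; MyRecord, the column names, and the table name are taken from or assumed around the question:

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using FastMember; // NuGet package: FastMember

    static void BulkSave(IEnumerable<MyRecord> records, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        using (var reader = ObjectReader.Create(records, "Name", "Age")) // members in table column order
        {
            bulk.DestinationTableName = "dbo.MyRecords"; // placeholder
            bulk.WriteToServer(reader); // streams rows; no intermediate DataTable
        }
    }

ObjectReader wraps any IEnumerable<T> as an IDataReader, so the per-row Rows.Add cost disappears.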

Supplying stream as a source of data for a binary column when SqlBulkCopy is used

If one needs to read data from SQL Server in a streamed fashion, there are capabilities for that, such as using SqlDataReader with CommandBehavior.SequentialAccess; in particular, when binary column data needs to be accessed, there is the GetStream(int) method:

    var cmd = new SqlCommand();
    cmd.Connection = connection;
    cmd.CommandText = @"select 0x0123456789 as Data";
    using (var dr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        dr.Read();
        var stream = dr.GetStream(0);
        // access stream
    }

But what about streaming data in the opposite direction, when one needs to feed data …
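
One known technique for the write direction (for a plain INSERT rather than SqlBulkCopy): since .NET 4.5, SqlClient will stream a Stream assigned as a parameter value instead of buffering it. A sketch, where dbo.Blobs(Data varbinary(max)) and sourceStream are assumptions:

    var insert = new SqlCommand("insert into dbo.Blobs (Data) values (@data)", connection);
    var p = insert.Parameters.Add("@data", SqlDbType.VarBinary, -1); // -1 => varbinary(max)
    p.Value = sourceStream; // any readable Stream; consumed lazily while the command runs
    insert.ExecuteNonQuery();

For SqlBulkCopy itself, the EnableStreaming property (also .NET 4.5+) tells WriteToServer to stream data from an IDataReader source rather than buffering it.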

Using SQLBulkCopy - Significantly larger tables in SQL Server 2016 than in SQL Server 2014

I have an application that uses SqlBulkCopy to move data into a set of tables. It has transpired recently that users on SQL Server 2016 are reporting problems with their hard drives being filled by very large databases (that should not be that large). This problem does not occur on SQL Server 2014. Upon inspection, running TableDataSizes.sql (script attached) showed large amounts of space in UnusedSpaceKB. I would like to know whether (a) there is some bug in SQL Server 2016, or whether our use of SqlBulkCopy has "clashed" with a new feature. I note that there have been some changes to page …
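
If the cause turns out to be SQL Server 2016's "fast inserts" behavior, where each bulk-load batch can allocate a full extent, the usual suggestions are fewer/larger batches on the client or trace flag 692 on the server; both are hedged guesses about this report, not a confirmed diagnosis. A client-side sketch, assuming the application currently loads in many small batches:

    using (var bulk = new SqlBulkCopy("<connection-string>"))
    {
        bulk.DestinationTableName = "dbo.TargetTable"; // placeholder
        bulk.BatchSize = 0; // 0 = copy everything in a single batch instead of many small ones
        bulk.WriteToServer(table);
    }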

OracleBulkCopy Memory Leak (OutOfMemory Exception)

Question: Below is the code I used to bulk copy data from a temp table dataTable into a destTable in an Oracle database. The dataTable has about 2 million records.

    using (OracleBulkCopy bulkCopy = new OracleBulkCopy(VMSDATAConnectionString))
    {
        try
        {
            foreach (OracleBulkCopyColumnMapping columnMapping in columnMappings)
                bulkCopy.ColumnMappings.Add(columnMapping);
            bulkCopy.DestinationTableName = destTableName;
            //bulkCopy.BatchSize = dataTable.Rows.Count;
            //bulkCopy.BulkCopyTimeout = 100;
            int defaultSize = …
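
A common mitigation for out-of-memory during such loads is to let the provider flush in batches rather than buffer all ~2M rows; note the question's own code has BatchSize commented out. A sketch, with the batch size as a tunable assumption:

    using (OracleBulkCopy bulkCopy = new OracleBulkCopy(VMSDATAConnectionString))
    {
        foreach (OracleBulkCopyColumnMapping columnMapping in columnMappings)
            bulkCopy.ColumnMappings.Add(columnMapping);
        bulkCopy.DestinationTableName = destTableName;
        bulkCopy.BatchSize = 50000;     // flush every 50k rows instead of holding everything
        bulkCopy.BulkCopyTimeout = 600; // generous timeout for a long-running load
        bulkCopy.WriteToServer(dataTable);
    }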

How to keep row order with SqlBulkCopy?

I'm exporting data programmatically from Excel to SQL Server 2005 using SqlBulkCopy. It works great; the only problem is that it doesn't preserve the row sequence I have in the Excel file. I don't have a column to order by; I just want the records to be inserted in the same order they appear in the Excel spreadsheet. I can't modify the Excel file and have to work with what I've got. Sorting by any of the existing columns will break the sequence. Please help. P.S. I ended up inserting an ID column into the spreadsheet; it looks like there's no way to keep the order during export/import. I don't think …
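
A sketch of the same ID-column workaround the asker landed on, done client-side so the Excel file itself stays untouched: stamp each row with its sheet position while building the DataTable. This assumes a RowSeq column can be added to the destination table; names and the excelRows enumeration are placeholders:

    var table = new DataTable();
    table.Columns.Add("RowSeq", typeof(int));  // added ordering column
    table.Columns.Add("Name", typeof(string)); // ...plus the real Excel columns

    int seq = 0;
    foreach (var excelRow in excelRows) // assumed to enumerate rows in sheet order
    {
        var row = table.NewRow();
        row["RowSeq"] = seq++;      // remember the original position
        row["Name"] = excelRow.Name; // copy the cells
        table.Rows.Add(row);
    }

    using (var bulk = new SqlBulkCopy("<connection-string>"))
    {
        bulk.DestinationTableName = "dbo.ImportTarget";
        bulk.WriteToServer(table); // afterwards, ORDER BY RowSeq recovers sheet order
    }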

postgresql: how to get primary keys of rows inserted with a bulk copy_from?

The goal is this: I have a set of values to go into table A, and a set of values to go into table B. The values going into B reference values in A (via a foreign key), so after inserting the A values I need to know how to reference them when inserting the B values. I need this to be as fast as possible. I made the B values insert with a bulk copy from:

    def bulk_insert_copyfrom(cursor, table_name, field_names, values):
        if not values:
            return
        print "bulk copy from prepare..."
        str_vals = "\n".join("\t".join(adapt(val).getquoted() for val in cur_vals) for cur_vals in values)
        strf = StringIO(str…
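
The approach usually suggested for this problem: reserve the A primary keys from the sequence up front, then write them explicitly into the COPY data so the B rows can reference them immediately. A sketch of that idea from .NET with Npgsql (the question itself uses psycopg2; the sequence and table names are assumptions):

    using Npgsql;

    // Reserve n primary keys from table A's sequence before the COPY.
    static long[] ReserveIds(NpgsqlConnection conn, int n)
    {
        using (var cmd = new NpgsqlCommand(
            "SELECT nextval('a_id_seq') FROM generate_series(1, @n)", conn))
        {
            cmd.Parameters.AddWithValue("n", n);
            var ids = new long[n];
            using (var rdr = cmd.ExecuteReader())
                for (int i = 0; rdr.Read(); i++)
                    ids[i] = rdr.GetInt64(0);
            return ids;
        }
    }
    // Include ids[i] as the id column in the COPY data for A,
    // and as the foreign-key value in the corresponding rows for B.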

SqlBulkCopy insert order

To copy data from one database to another on a different server with the same schema, I am planning to use the SqlBulkCopy class from the C# library. Will SqlBulkCopy maintain the same order as in the DataTable while inserting the records?

Example (id is the identity column):

Server1, db1, TableA:
    id    name
    1     name10
    2     name20
    3     name30
    4     name40

Server2, db1, TableA:
    id    name
    1     name1
    2     name2
    3     name3
    4     name4
    ..........
    5000  name22
    5001  name33

Step 1: var dt = select * from server1.dbo.TableA order by id;
Step 2: SQL bulk copy into server2: bulkCopy.WriteToServer(dt);
Step 3: var resultDt = …
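
A way to sidestep the ordering question entirely, often recommended in this situation: copy the source identity values as data using SqlBulkCopyOptions.KeepIdentity, so rows on the two servers match by id regardless of physical insert order. A hedged sketch (destConnectionString is a placeholder):

    // dt already holds "select * from server1.dbo.TableA order by id".
    using (var bulkCopy = new SqlBulkCopy(destConnectionString, SqlBulkCopyOptions.KeepIdentity))
    {
        bulkCopy.DestinationTableName = "dbo.TableA";
        bulkCopy.WriteToServer(dt); // server1 id values are preserved verbatim on server2
    }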

F# DataTable to SQL using SqlBulkCopy

I have an F# program that creates a DataTable, populates it with one row, and then writes the data to SQL Server using bulk insert (SqlBulkCopy). Although it's working, I can't figure out how to include a loop that generates a number of list items / data rows which I can then insert in one statement, rather than bulk inserting a single row at a time (which is the current case). Here's my code:

    open System
    open System.Data
    open System.Data.SqlClient

    let lcpSqlConnection = new SqlConnection("<my-connection-string>")
    lcpSqlConnection.Open()

    let bulkLoadEsgData (conn…
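
For the build-a-loop-then-insert-once shape the question is after, here is the pattern sketched in C# (the same System.Data API the F# code calls into; the column and table names are placeholders):

    var table = new DataTable();
    table.Columns.Add("Name", typeof(string));
    table.Columns.Add("Value", typeof(float));

    // Build all the rows first...
    for (int i = 0; i < 1000; i++)
        table.Rows.Add("row" + i, (float)i);

    // ...then write them to SQL Server in a single bulk insert.
    using (var bulk = new SqlBulkCopy(lcpSqlConnection))
    {
        bulk.DestinationTableName = "dbo.EsgData"; // placeholder
        bulk.WriteToServer(table);
    }

The same shape translates directly to F#: fill the DataTable inside a for loop, then call WriteToServer once after the loop.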

SqlBulkCopy and DataTables with Parent/Child Relation on Identity Column

Question: We need to update several tables that have parent/child relationships based on an identity primary key in the parent table, which is referred to by one or more child tables as a foreign key. Due to the high volume of data, we would like to build these tables in memory, then use SqlBulkCopy from C# to update the database en masse from either the DataSet or the individual DataTables. We would further like to do this in parallel, from multiple threads, processes, and possibly clients. Our …
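
One pattern that fits these constraints (an assumption on my part, not the thread's answer as quoted here): stop round-tripping for IDENTITY values and instead allocate key ranges up front, e.g. from a SEQUENCE via sp_sequence_get_range, assign parent keys and child foreign keys in memory, then bulk copy each table. A sketch, with dbo.ParentSeq as an assumed SEQUENCE:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // Reserve a contiguous block of `size` keys; safe to call from many threads/clients.
    static long ReserveKeyRange(SqlConnection conn, int size)
    {
        using (var cmd = new SqlCommand("sys.sp_sequence_get_range", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@sequence_name", "dbo.ParentSeq");
            cmd.Parameters.AddWithValue("@range_size", size);
            var first = cmd.Parameters.Add("@range_first_value", SqlDbType.Variant);
            first.Direction = ParameterDirection.Output;
            cmd.ExecuteNonQuery();
            return Convert.ToInt64(first.Value); // first key in the reserved block
        }
    }
    // Assign keys first..first+size-1 to the parent DataTable and to the child
    // tables' FK columns, then SqlBulkCopy each table (with KeepIdentity if the
    // parent key column remains an identity).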