SqlBulkCopy

Possible to get PrimaryKey IDs back after a SQL BulkCopy?

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-17 18:34:57
Question: I am using C# with SqlBulkCopy, and I have a problem: I need to do a mass insert into one table, then another mass insert into a second table, and the two tables have a PK/FK relationship.

    Table A: Field1 is the PK, auto-incrementing (easy to do with SqlBulkCopy; straightforward).
    Table B: Field1 is the PK/FK. This field creates the relationship and is also the PK of this table. It is not auto-incrementing and needs the same ID as the corresponding row in Table A.

So these tables have a one-to-one relationship, but I am …
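A common workaround for this question (a sketch, not the asker's code; the table, column, and staging names are assumptions): SqlBulkCopy itself never returns generated identity values, but you can bulk-load into a staging table and then move the rows with an INSERT … OUTPUT statement, which streams back one identity per inserted row.

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Assumed schema: TableA(Field1 BIGINT IDENTITY PK, Payload), StagingA(Payload).
static List<long> BulkInsertAndGetIds(string connectionString, DataTable rows)
{
    using var conn = new SqlConnection(connectionString);
    conn.Open();

    // Step 1: fast load into the staging table.
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "StagingA" })
        bulk.WriteToServer(rows);

    // Step 2: move rows into TableA; OUTPUT returns each generated Field1.
    using var cmd = new SqlCommand(
        "INSERT INTO TableA (Payload) OUTPUT INSERTED.Field1 " +
        "SELECT Payload FROM StagingA;", conn);

    var ids = new List<long>();
    using var reader = cmd.ExecuteReader();
    while (reader.Read())
        ids.Add(reader.GetInt64(0));
    return ids; // use these to build the TableB rows before bulk-copying them
}
```

If row-to-ID pairing matters, carry a client-side key column through the staging table and OUTPUT it alongside the identity rather than relying on ordering.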

SqlBulkCopy and Entity Framework

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-17 15:44:42
Question: My current project consists of three standard layers: data, business, and presentation. I would like to use data entities for all my data-access needs. Part of the app's functionality is that it will need to copy all of the data in a flat file into a database. The file is not very big, so I can use SqlBulkCopy. I have found several articles on using the SqlBulkCopy class in .NET; however, all of them use DataTables to move data back and forth. Is there a way to use data …
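One pragmatic bridge (a sketch; the entity type and its properties are hypothetical): project the entities into a DataTable via reflection once, then hand that to SqlBulkCopy, keeping the rest of the data layer entity-based.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using System.Reflection;

// Build a DataTable whose columns mirror T's public instance properties.
static DataTable ToDataTable<T>(IEnumerable<T> items)
{
    var props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
    var table = new DataTable(typeof(T).Name);

    foreach (var p in props)
        table.Columns.Add(p.Name,
            Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

    foreach (var item in items)
        table.Rows.Add(props
            .Select(p => p.GetValue(item) ?? (object)DBNull.Value)
            .ToArray());
    return table;
}

// Usage sketch: bulkCopy.WriteToServer(ToDataTable(customers));
```

For large volumes, a streaming IDataReader over the entities avoids materializing a second copy in memory; see the next question.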

Get an IDataReader from a typed List

Submitted by 我是研究僧i on 2019-12-17 04:53:31
Question: I have a List<MyObject> with a million elements. (It is actually a SubSonic collection, but it is not loaded from the database.) I'm currently using SqlBulkCopy as follows:

    private string FastInsertCollection(string tableName, DataTable tableData)
    {
        string sqlConn = ConfigurationManager.ConnectionStrings[SubSonicConfig.DefaultDataProvider.ConnectionStringName].ConnectionString;
        using (SqlBulkCopy s = new SqlBulkCopy(sqlConn, SqlBulkCopyOptions.TableLock))
        {
            s.DestinationTableName = tableName;
            …
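For a typed list this size, materializing a DataTable doubles memory. One widely used alternative (an assumption; the question doesn't name a library) is the FastMember NuGet package, whose ObjectReader wraps an IEnumerable<T> as a streaming IDataReader that SqlBulkCopy can consume directly.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using FastMember; // NuGet: FastMember

static void FastInsertList(string connectionString, string tableName,
                           IEnumerable<MyObject> items)
{
    using var bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock)
    {
        DestinationTableName = tableName
    };
    // The column names listed here (assumptions) must match MyObject's
    // properties and the destination table's columns.
    using var reader = ObjectReader.Create(items, "Id", "Name", "CreatedOn");
    bulk.WriteToServer(reader);
}
```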

Bulk insert strategy from c# to SQL Server

Submitted by 此生再无相见时 on 2019-12-14 03:41:51
Question: In our current project, customers send collections of complex/nested messages to our system, at a frequency of roughly 1,000-2,000 messages per second. These complex objects contain transaction data (to be added) as well as master data (which is added if not found). But instead of passing the IDs of the master data, the customer passes the 'name' column. The system checks whether master data exists for these names; if found, it uses the IDs from the database, otherwise it creates this master …
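At that message rate, resolving names row-by-row won't keep up. One common set-based strategy (a sketch; the table and column names are assumptions): bulk-copy the distinct names into a temp table, upsert the master table in one statement, then read back the full name-to-ID map.

```csharp
using System.Data.SqlClient;

// Assumes #Names(Name) was created and bulk-copied on `conn` in this session,
// and the master table is MasterData(Id INT IDENTITY PK, Name UNIQUE).
static void ResolveMasterIds(SqlConnection conn, SqlTransaction tx)
{
    const string sql = @"
        -- Insert only the names we haven't seen before (HOLDLOCK guards
        -- against a concurrent writer inserting the same name).
        MERGE MasterData WITH (HOLDLOCK) AS target
        USING (SELECT DISTINCT Name FROM #Names) AS src
            ON target.Name = src.Name
        WHEN NOT MATCHED THEN INSERT (Name) VALUES (src.Name);

        -- Return the id for every name in the batch, old and new.
        SELECT m.Id, m.Name
        FROM MasterData m
        JOIN (SELECT DISTINCT Name FROM #Names) n ON n.Name = m.Name;";

    using var cmd = new SqlCommand(sql, conn, tx);
    using var reader = cmd.ExecuteReader();
    while (reader.Read())
    {
        // Cache reader.GetInt32(0) keyed by reader.GetString(1), and use the
        // map to fill FK columns before bulk-copying the transaction rows.
    }
}
```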

Inserting Bulk Data in SQL: OLEDB IRowsetFastLoad vs. Ado.Net SqlBulkCopy

Submitted by 旧巷老猫 on 2019-12-14 02:26:08
Question: I am evaluating different methods for inserting large amounts of data into SQL Server. I've found the SqlBulkCopy class in ADO.NET and the IRowsetFastLoad interface in OLE DB. As far as I know, IRowsetFastLoad doesn't map to C#, which is my base platform, so I am evaluating whether it would be worth creating a .NET wrapper around IRowsetFastLoad so I can use it in my application. Does anyone know whether IRowsetFastLoad would actually perform better than SqlBulkCopy? Would it be worth creating such …

Skip some columns in SqlBulkCopy

Submitted by 做~自己de王妃 on 2019-12-13 11:42:12
Question: I'm using SqlBulkCopy between two SQL Server 2008 instances with different sets of columns (moving some data from a prod server to dev), so I want to skip columns that don't yet exist or haven't yet been removed. How can I do that? Some trick with ColumnMappings? Edit: I do the following:

    DataTable table = new DataTable();
    using (var adapter = new SqlDataAdapter(sourceCommand))
    {
        adapter.Fill(table);
    }
    table.Columns
        .OfType<DataColumn>()
        .ForEach(c => bulk.ColumnMappings.Add(
            new SqlBulkCopyColumnMapping(c…
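The usual fix is to map only the intersection of the two column sets, so SqlBulkCopy never sees a source column the destination lacks. A sketch (the `destColumns` set is an assumption; populate it from `sys.columns` or an empty `SELECT TOP 0` against the destination table):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void MapSharedColumns(SqlBulkCopy bulk, DataTable table,
                             ISet<string> destColumns)
{
    foreach (DataColumn c in table.Columns)
    {
        // Only map columns that exist on both sides; everything else is skipped.
        if (destColumns.Contains(c.ColumnName))
            bulk.ColumnMappings.Add(c.ColumnName, c.ColumnName);
    }
}
```

Note that as soon as any explicit mapping is added, SqlBulkCopy stops doing positional auto-mapping, so unmapped source columns are simply ignored.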

Can't insert data table using sqlbulkcopy

Submitted by 筅森魡賤 on 2019-12-13 11:35:44
Question: This is my code, with the following columns; in the DB, those columns are nvarchars.

    SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, System.Data.SqlClient.SqlBulkCopyOptions.Default, transaction);
    bulkCopy.DestinationTableName = "Test";
    bulkCopy.ColumnMappings.Add("Number", "Code");
    bulkCopy.ColumnMappings.Add("Type", "Type");
    bulkCopy.ColumnMappings.Add("Group", "Group");
    bulkCopy.ColumnMappings.Add("Short Text", "ShortText");
    bulkCopy.ColumnMappings.Add("Text", "Description");
    …

Date Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM OverFlow error SqlBulkCopy [duplicate]

Submitted by 本秂侑毒 on 2019-12-13 06:40:35
Question: This question already has answers here: Error - SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM (12 answers). Closed 4 years ago. I am reading data from an Access DB and storing it in a temporary SQL table, then truncating the main SQL table and inserting the fresh data set. I am accomplishing that task using the code below, but the datetime values are giving me issues:

    Console.WriteLine("NetWeightTracking-Abilene Started");
    var du = new System.Data.DataTable();
    …
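SQL Server's legacy datetime type cannot hold values before 1753-01-01, and Access data (or default-initialized .NET DateTime values of 0001-01-01) can easily produce them. A minimal sketch that scrubs the DataTable before the bulk copy (the nullable-column assumption is mine):

```csharp
using System;
using System.Data;

static void ScrubDates(DataTable table)
{
    var sqlMin = new DateTime(1753, 1, 1); // lower bound of SQL Server datetime

    foreach (DataRow row in table.Rows)
        foreach (DataColumn col in table.Columns)
            if (row[col] is DateTime dt && dt < sqlMin)
                row[col] = DBNull.Value; // assumes the column is nullable
}
```

Alternatively, declare the destination columns as datetime2, whose range starts at 0001-01-01, and no scrubbing is needed.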

COPY_FROM throws .read() call error on bulk data on psycopg2

Submitted by 寵の児 on 2019-12-13 03:09:20
Question: I am trying to port a MongoDB database over to Postgres. I am using COPY_FROM to insert bulk data faster, but I keep getting the same error:

    psycopg2.extensions.QueryCanceledError: COPY from stdin failed: error in .read() call

My original code tries to populate the table in one go, but if I split the data into batches, there is a threshold below which the error isn't thrown, around 3 million records per batch:

    # Base case (only 1 iteration): throws .read() call error
    MAX_BATCH_SIZE = total…
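A batching helper like the following (a sketch; `rows`, the table name, and the separator are assumptions) avoids handing copy_from one multi-gigabyte file-like object and instead streams fixed-size chunks, each in its own COPY:

```python
import io


def batches(lines, batch_size):
    """Yield successive lists of at most batch_size items from any iterable."""
    batch = []
    for line in lines:
        batch.append(line)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch
        yield batch


# Usage sketch with psycopg2 (cur/conn/rows assumed to exist):
# for chunk in batches(rows, 1_000_000):
#     cur.copy_from(io.StringIO("".join(chunk)), "target_table", sep="\t")
#     conn.commit()  # commit per batch so a failure only loses one chunk
```

Committing per batch also keeps the server-side transaction (and WAL pressure) bounded, which is often what pushes a single giant COPY over the edge.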

BCP Import error “Invalid character value for cast specification”

Submitted by 左心房为你撑大大i on 2019-12-12 12:37:36
Question: I am using BCP for import/export and getting an "Invalid character value for cast specification" error for only one row (the first row of the export) while trying to import the data back.

Table structure:

    Col1 -- NUMERIC(19,0)
    Col2 -- NVARCHAR(400)
    Col3 -- NVARCHAR(400)

I am using the following command for export:

    EXEC master..xp_cmdshell 'bcp "SELECT TOP 10 Col1, Col2, Col3 FROM Server.dbo.TableName" queryout C:\Data\File.dat -S Server -T -t"<EOFD>" -r"<EORD>" -w'

The same way I am generating a FORMAT file: EXEC master. …