SqlBulkCopy

How can I set column type when using SqlBulkCopy to insert into a sql_variant column

Submitted by 南笙酒味 on 2019-12-10 19:18:09
Question: I'm using SqlBulkCopy to insert/update from a .NET DataTable object into a SQL Server table that includes a sql_variant column. However, SqlBulkCopy insists on storing DateTime values placed into that column as SQL type 'datetime', when what I need is 'datetime2'. My DataTable is defined like this:

    DataTable dataTable = new DataTable();
    dataTable.Columns.Add(new DataColumn("VariantValue", typeof(object))); // this represents my sql_variant column

Then I throw some data in there that requires a…
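The excerpt cuts off before the rest of the setup. For reference, here is a minimal sketch that completes the scenario described above so it can be reproduced; the connection string and destination table name are placeholders, not taken from the question, and this reproduces the behaviour rather than fixing it (DateTime values land in the sql_variant column with base type datetime).

    using System;
    using System.Data;
    using System.Data.SqlClient;

    class VariantRepro
    {
        static void Main()
        {
            var dataTable = new DataTable();
            dataTable.Columns.Add(new DataColumn("VariantValue", typeof(object))); // the sql_variant column
            dataTable.Rows.Add(DateTime.UtcNow); // precision beyond 'datetime' would need 'datetime2'

            using (var bulkCopy = new SqlBulkCopy("Server=.;Database=Test;Integrated Security=true"))
            {
                bulkCopy.DestinationTableName = "dbo.VariantTable"; // placeholder table
                bulkCopy.ColumnMappings.Add("VariantValue", "VariantValue");
                bulkCopy.WriteToServer(dataTable); // the DateTime arrives with base type 'datetime'
            }
        }
    }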

SQL Bulk Copy “The given value of type String from the data source cannot be converted to type datetime of the specified target column” using ASP.NET

Submitted by 放肆的年华 on 2019-12-10 14:54:57
Question: I'm working on an ASP.NET MVC4 project and I'm trying to export data from an .xlsx file (Excel 2010) into my database using SQL Bulk Copy. My Excel file contains only 2 columns: the first contains numbers (from 1 to 25) and the second contains characters (successive series of "a, b, c"). This is how I try to export the data, but I get the error "The given value of type String from the data source cannot be converted to type int of the specified target column": public…
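The code is truncated above. The error itself means the source is handing SqlBulkCopy strings where the destination expects int. A minimal sketch of one common fix, assuming the Excel reader returns every cell as a string (the method, column, and table names below are placeholders): convert the values into a DataTable whose column types match the destination before calling WriteToServer.

    using System;
    using System.Data;
    using System.Data.SqlClient;

    static class ExcelImportSketch
    {
        // rawRows stands in for whatever the Excel reader produced (all values as strings).
        static void BulkInsert(string connectionString, (string Number, string Letter)[] rawRows)
        {
            var table = new DataTable();
            table.Columns.Add("Number", typeof(int));    // typed to match the destination int column
            table.Columns.Add("Letter", typeof(string));

            foreach (var row in rawRows)
                table.Rows.Add(int.Parse(row.Number), row.Letter); // convert before copying

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.MyTable"; // placeholder
                bulkCopy.ColumnMappings.Add("Number", "Number");
                bulkCopy.ColumnMappings.Add("Letter", "Letter");
                bulkCopy.WriteToServer(table);
            }
        }
    }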

SQLBulkCopy with Identity Insert in destination table

Submitted by 天大地大妈咪最大 on 2019-12-10 14:30:46
Question: I am trying to insert a generic list into SQL Server with SqlBulkCopy, and I am having trouble with the identity field. I want my destination table to generate the identity field. How should I handle this? Here is my code:

    using (var bulkCopy = new SqlBulkCopy(ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString))
    {
        bulkCopy.BatchSize = (int)DetailLines;
        bulkCopy.DestinationTableName = "dbo.tMyTable";
        var table = new DataTable();
        var props = TypeDescriptor.GetProperties(typeof…
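The excerpt ends mid-snippet. A hedged sketch of the usual approach (not taken from the question's answers): as long as SqlBulkCopyOptions.KeepIdentity is not specified, the destination table assigns its own identity values, and the simplest setup maps only the non-identity columns. Column names below are placeholders.

    using System.Data;
    using System.Data.SqlClient;

    static class IdentityInsertSketch
    {
        static void BulkInsert(string connectionString, DataTable table)
        {
            // Default options (no SqlBulkCopyOptions.KeepIdentity) let SQL Server
            // generate the identity values itself.
            using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.Default))
            {
                bulkCopy.DestinationTableName = "dbo.tMyTable";
                // Map only the non-identity columns; the identity column is simply omitted.
                bulkCopy.ColumnMappings.Add("Name", "Name");     // placeholder columns
                bulkCopy.ColumnMappings.Add("Amount", "Amount");
                bulkCopy.WriteToServer(table);
            }
        }
    }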

Is there a way to use SqlBulkCopy without converting the data to a DataTable?

Submitted by 我是研究僧i on 2019-12-10 12:47:52
Question: Is there a way to use SqlBulkCopy without converting the data to a DataTable? I have a list of objects (List<T>) in RAM and I really don't want to use more memory to create the DataTable. Could it be possible to implement IDataReader on a List? Thanks!

Answer 1: I would certainly imagine that you could. BulkDataReader requires schema information; that's why you can't simply provide a List<T>. If you design a class that implements IDataReader, you'll be providing this in your GetSchemaTable…
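Beyond hand-writing an IDataReader, one off-the-shelf option (not mentioned in the excerpt, so treat it as an assumption) is FastMember's ObjectReader, which exposes a list as an IDataReader so SqlBulkCopy can stream it directly; the Person class and table name below are made up for illustration.

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using FastMember; // NuGet package "FastMember"

    class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    static class ListBulkCopySketch
    {
        static void BulkInsert(string connectionString, List<Person> people)
        {
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            using (var reader = ObjectReader.Create(people, "Id", "Name")) // IDataReader over the list
            {
                bulkCopy.DestinationTableName = "dbo.People"; // placeholder
                bulkCopy.WriteToServer(reader);               // no intermediate DataTable
            }
        }
    }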

F# DataTable to SQL using SqlBulkCopy

Submitted by 社会主义新天地 on 2019-12-10 09:32:43
Question: I have an F# program that creates a DataTable, populates it with one row, and then writes the data to SQL Server using bulk insert (SqlBulkCopy). Although it's working, I can't really figure out how to include a loop that will generate a number of list items / data rows which I can then insert in one statement, rather than having to bulk insert a single row at a time (which is the current case). Here's my code:

    open System
    open System.Data
    open System.Data.SqlClient
    let lcpSqlConnection = …
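The pattern being asked about is to accumulate all the rows first and make a single WriteToServer call. A minimal sketch of that pattern, shown in C# to keep the examples on this page in one language (the same DataTable and SqlBulkCopy members are what an F# version would call); table and column names are placeholders.

    using System.Data;
    using System.Data.SqlClient;

    static class MultiRowBulkCopySketch
    {
        static void BulkInsert(string connectionString)
        {
            var table = new DataTable();
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Value", typeof(string)); // placeholder columns

            // Generate many rows in a loop instead of one at a time...
            for (int i = 0; i < 1000; i++)
                table.Rows.Add(i, "row " + i);

            // ...then insert them all with a single bulk-copy call.
            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.MyTable";
                bulkCopy.WriteToServer(table);
            }
        }
    }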

C# Optimisation: Inserting 200 million rows into database

Submitted by 我的未来我决定 on 2019-12-09 06:31:51
Question: I have the following (simplified) code which I'd like to optimise for speed:

    long inputLen = 50000000; // 50 million
    DataTable dataTable = new DataTable();
    DataRow dataRow;
    object[] objectRow;
    while (inputLen--)
    {
        objectRow[0] = ...
        objectRow[1] = ...
        objectRow[2] = ...

        // Generate output for this input
        output = ...

        for (int i = 0; i < outputLen; i++) // outputLen can range from 1 to 20,000
        {
            objectRow[3] = output[i];
            dataRow = dataTable.NewRow();
            dataRow.ItemArray = objectRow;
            dataTable.Rows…
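The snippet is cut off above. One commonly used optimisation for loads of this size, sketched here as an assumption rather than the asker's actual solution: flush the DataTable to the server every N rows and clear it, so the 50 million inputs (and their up-to-20,000 outputs each) never sit in one giant DataTable. Table and column names are placeholders.

    using System.Data;
    using System.Data.SqlClient;

    static class BatchedBulkCopySketch
    {
        static void Generate(string connectionString)
        {
            const int flushEvery = 100000;
            var table = new DataTable();
            table.Columns.Add("A", typeof(int));
            table.Columns.Add("B", typeof(int)); // placeholder columns

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.Output"; // placeholder
                bulkCopy.BulkCopyTimeout = 0;                 // no timeout for a long-running load

                for (long i = 0; i < 50000000; i++)
                {
                    table.Rows.Add((int)(i % 1000), (int)(i % 7)); // stand-in for the real row generation
                    if (table.Rows.Count == flushEvery)
                    {
                        bulkCopy.WriteToServer(table);
                        table.Clear();                        // reuse the same DataTable
                    }
                }

                if (table.Rows.Count > 0)
                    bulkCopy.WriteToServer(table);            // final partial batch
            }
        }
    }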

SqlBulkCopy DataTable with WellKnownText spatial data column

Submitted by 家住魔仙堡 on 2019-12-08 17:12:09
Question: I'm trying to bulk copy a DataTable which has the following columns:

    "ID"   - System.Int32
    "Geom" - System.String

into a SQL database table with the following columns:

    "Id"    - int
    "Shape" - geometry

Can anyone advise on the best way to do this? Some test code if it helps...

    DataTable dataTable = new DataTable();
    dataTable.Columns.Add("ID", typeof(Int32));
    dataTable.Columns.Add("Geom", typeof(String));
    dataTable.Rows.Add(1, "POINT('20,20')");
    dataTable.Rows.Add(1, "POINT('40,25')");
    dataTable.Rows.Add…
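One possible approach (an assumption on my part, not from the question): build the DataTable with a Microsoft.SqlServer.Types.SqlGeometry column instead of a string column, parsing the well-known text up front, so the values already have the CLR type that maps to the geometry column. Note that standard WKT takes the form "POINT (20 20)", without quotes or commas; the table name below is a placeholder.

    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Types; // SQL Server spatial types assembly/package

    static class SpatialBulkCopySketch
    {
        static void BulkInsert(string connectionString)
        {
            var table = new DataTable();
            table.Columns.Add("ID", typeof(int));
            table.Columns.Add("Geom", typeof(SqlGeometry)); // CLR type matching the geometry column

            table.Rows.Add(1, SqlGeometry.STGeomFromText(new SqlChars("POINT (20 20)"), 0));
            table.Rows.Add(2, SqlGeometry.STGeomFromText(new SqlChars("POINT (40 25)"), 0));

            using (var bulkCopy = new SqlBulkCopy(connectionString))
            {
                bulkCopy.DestinationTableName = "dbo.Shapes"; // placeholder
                bulkCopy.ColumnMappings.Add("ID", "Id");
                bulkCopy.ColumnMappings.Add("Geom", "Shape");
                bulkCopy.WriteToServer(table);
            }
        }
    }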

Is there a faster way to use SqlBulkCopy than using a DataTable?

Submitted by 社会主义新天地 on 2019-12-08 08:39:09
Question: I load a large number of records into my application (1 million+) and do a ton of processing on them. The processing requires them all to be in memory. Afterwards, I want to dump all of the (now modified) records into an empty table. Loading the records takes mere seconds, and I end up with a large array of MyRecord items. Saving using SqlBulkCopy takes mere seconds as well. However, SqlBulkCopy requires (I believe) a DataTable, and loading my records into a DataTable is slow…

Error while importing data into Redshift

Submitted by 只谈情不闲聊 on 2019-12-08 05:20:42
Question: I wanted to unload from one database (production) and reload into another database (QA) in Redshift, both having exactly the same schema. I issued the S3 load command as follows:

    copy table(col1,col2,col3,col4)
    from 's3://<bucket_path>/<file_name>.gzip'
    CREDENTIALS 'aws_access_key_id=<your_key>;aws_secret_access_key=<your_secret>'
    delimiter '|' gzip
    NULL AS 'null_string';

I got the following error:

    ERROR: Failed writing body (0 != XXX)
    Cause: Failed to inflate: invalid or incomplete deflate data. zlib error code: -3

How to use SQL Bulk Copy with Dapper.NET?

Submitted by 生来就可爱ヽ(ⅴ<●) on 2019-12-08 04:51:00
Question: I am working with Dapper.NET for bulk insert operations into SQL tables. I am thinking of using SqlBulkCopy with Dapper.NET, but I don't have any experience with it. How do I use SqlBulkCopy with Dapper.NET? Your help is highly appreciated.

Answer 1: It is not a good idea to use Dapper for bulk inserts, because it will not be fast. The better choice for this is the SqlBulkCopy class. But if you want to use Dapper for bulk inserts, you can find a solution here.

Source: https://stackoverflow.com/questions/29070108/how-to…
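A minimal sketch of how the two are typically combined (an assumption, since the excerpt stops before any code): Dapper runs the ordinary SQL on an open SqlConnection, and SqlBulkCopy reuses that same connection for the bulk insert. Table names are placeholders.

    using System.Data;
    using System.Data.SqlClient;
    using Dapper; // Dapper NuGet package

    static class DapperPlusBulkCopySketch
    {
        static void Import(string connectionString, DataTable rows)
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();

                // Dapper handles the surrounding SQL...
                connection.Execute("TRUNCATE TABLE dbo.Staging"); // placeholder table

                // ...while SqlBulkCopy does the actual bulk insert on the same connection.
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.Staging";
                    bulkCopy.WriteToServer(rows);
                }

                connection.Execute("INSERT INTO dbo.Target SELECT * FROM dbo.Staging"); // placeholder follow-up
            }
        }
    }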