bulkinsert

Will all inserts performed when using LOAD DATA INFILE be rolled back if it fails?

时光总嘲笑我的痴心妄想 posted on 2019-12-13 01:14:26

Question: I use LOAD DATA INFILE from C#, via the MySqlBulkLoader object of the MySQL Connector for .NET, to load 24 GB of data into a MySQL 5.5 table. I am wondering whether, in case of failure (for any reason, even warnings), the bulk insertion would be properly rolled back as if it had been done within a transaction, or whether I should expect to see the first successful records already committed. The relevant page of the MySQL manual is not very informative in this regard. Answer 1: MySQL in auto-commit mode
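The answer excerpt above is cut off; the usual guidance is that on an InnoDB table a single LOAD DATA INFILE statement is atomic by itself, but under autocommit it commits as soon as it succeeds. A minimal sketch of forcing all-or-nothing behavior by wrapping the load in an explicit transaction (table and file names are placeholders):

```sql
-- Sketch, assuming an InnoDB table; with autocommit on, each statement
-- commits on its own, so an explicit transaction is needed.
START TRANSACTION;
LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE big_table
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
-- Inspect SHOW WARNINGS here, then either:
COMMIT;      -- keep the rows
-- ROLLBACK; -- or discard them all
```

Note that this only works for transactional storage engines; a MyISAM table cannot roll back a partially completed load.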

BULK INSERT / OPENROWSET FormatFile Terminator for CSV file with , (comma) in the data

…衆ロ難τιáo~ posted on 2019-12-13 00:18:57

Question: I've written a nice import for my million-row CSV that works quite nicely (using OPENROWSET BULK; I didn't use BULK INSERT because I need to cross join with some other columns). The format file uses a comma as the terminator. Here is an example of the CSV I was using to develop:

Reference, Name, Street
1,Dave Smith, 1 Test Street
2,Sally Smith,1 Test Street

Once I'd got it working, someone reminded me that the data itself could have a comma in it. Whoops!

Reference, Name, Street
"1","Dave
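The question above predates it, but SQL Server 2017 added native support for RFC-4180 quoting, which handles commas inside quoted fields directly. A sketch using BULK INSERT (path and table name are placeholders; the same FORMAT and FIELDQUOTE options apply to OPENROWSET BULK):

```sql
-- Sketch, SQL Server 2017+ only; quoted fields with embedded commas
-- are parsed correctly without a custom format file.
BULK INSERT dbo.People
FROM 'C:\import\people.csv'
WITH (FORMAT = 'CSV', FIELDQUOTE = '"', FIRSTROW = 2);
```

On older versions the usual workaround is a format file whose field terminators are `","` (quote-comma-quote), with the stray leading and trailing quotes on the first and last columns trimmed afterwards.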

Are bulk inserts atomic in MongoDB

谁说胖子不能爱 posted on 2019-12-12 21:26:43

Question: I am learning about MongoDB. If I create a bulk write, is this transaction all or nothing? I have a scenario where my users can delete who they are friends with.

FRIEND 1 | FRIEND 2
User B   | User A
User A   | User B

For this to happen I need to delete both sides of the bidirectional relationship. For consistency I need these to occur as all or nothing, because I wouldn't want only 1 of the 2 operations to succeed, as that would cause bad data. Reading the docs I could not find the answer: https://docs

How to modify data in .csv during BULK INSERT?

…衆ロ難τιáo~ posted on 2019-12-12 19:36:04

Question: I'm trying to convert a web application I built on MySQL to Microsoft SQL Server and need some guidance. I've got a variety of different sources of CSV data, and I was using LOAD DATA LOCAL INFILE to modify the contents (e.g. change case to uppercase, remove whitespace, concatenate several fields into one, etc.), add some data (the account number and current date/time), ignore some data (assign it to a dummy variable and never use it), and put the data into my database in the correct columns. Can I
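SQL Server's BULK INSERT cannot transform data in flight the way LOAD DATA's SET clause can; the common pattern is to load into a staging table first and transform while copying. A sketch under that assumption (all table, column, and file names are placeholders):

```sql
-- Sketch: load raw rows, then transform on the way into the real table.
BULK INSERT dbo.StagingCustomers
FROM 'C:\import\customers.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

DECLARE @AccountNo int = 12345;  -- hypothetical account number

INSERT INTO dbo.Customers (AccountNo, FullName, LoadedAt)
SELECT @AccountNo,
       UPPER(LTRIM(RTRIM(FirstName)) + ' ' + LTRIM(RTRIM(LastName))),
       GETDATE()
FROM dbo.StagingCustomers;  -- unwanted staging columns are simply not selected
```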

Bulk insert not working for NULL data

给你一囗甜甜゛ posted on 2019-12-12 19:17:41

Question: When I am inserting bulk data into a table from a CSV file, it is not working, showing an error like:

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 9

The column 9 value in the CSV file is null. How do I take care of this? Answer 1: From this amount of information I'd say the field in question in the target table is defined as NOT NULL. To work around the issue you have to: a) modify the CSV and add a value to the field(s) where they are null, or b) modify the target
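If the target column is made nullable, the KEEPNULLS option is what makes empty CSV fields load as NULL instead of the column default. A sketch (names are placeholders):

```sql
-- Sketch: KEEPNULLS loads empty fields as NULL rather than the column
-- default; the target column must itself allow NULL, or the error in
-- the question persists.
BULK INSERT dbo.TargetTable
FROM 'C:\import\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS);
```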

Bulk insert with Spring Boot and Spring Data JPA not working

百般思念 posted on 2019-12-12 13:17:29

Question: I know that there are many similar questions on this topic, but I really need a working solution. I'm trying to configure Spring Boot and Spring Data JPA to perform bulk inserts in a batch. The goal is to commit every N records, not every single record, when calling repository.save(). What I've tried so far in application.properties: spring.jpa.properties.hibernate.jdbc.batch_size=100 spring.jpa.properties.hibernate.order_inserts=true spring.jpa.properties.hibernate
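The property list above is truncated; a commonly cited combination, plus the usual gotcha, looks like this (a sketch per the Hibernate batching documentation, not a guaranteed fix for the question's setup):

```properties
spring.jpa.properties.hibernate.jdbc.batch_size=100
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.order_updates=true
```

The frequent reason these settings appear to do nothing: Hibernate silently disables JDBC batching for entities whose IDs use IDENTITY generation, so switching to a SEQUENCE (or TABLE) generator is usually required as well.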

Insert an empty string on SQL Server with BULK INSERT

試著忘記壹切 posted on 2019-12-12 11:24:21

Question: The example table contains the fields Id (the identity of the table, an integer) and Name (a simple attribute that allows null values; it's a string). I'm trying a CSV that contains this:

1,
1,""
1,''

None of them gives me an empty string as the result of the bulk insertion. I'm using SQL Server 2012. What can I do? Answer 1: As far as I know, BULK INSERT can't insert an empty string; it can either keep the null value (with the KEEPNULLS option) or use the default value (without it). For your 3 sample
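A common two-step workaround consistent with the answer above: let BULK INSERT load the empty fields as NULL, then convert them to empty strings afterwards. A sketch (names are placeholders):

```sql
-- Sketch: load with KEEPNULLS so empty fields arrive as NULL,
-- then rewrite NULLs as empty strings in a second pass.
BULK INSERT dbo.Example
FROM 'C:\import\example.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS);

UPDATE dbo.Example SET Name = '' WHERE Name IS NULL;
```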

What is the difference between inserting data using Sql insert statements and SqlBulkCopy?

江枫思渺然 posted on 2019-12-12 10:24:04

Question: I have a problem inserting a huge amount of data into SQL Server. Previously I was using Entity Framework, but it was painfully slow for just 100K root-level records (each containing two distinct collections, together operating on roughly 200K further records), so roughly 500K-600K records in memory. I applied all the usual optimizations (e.g. AutoDetectChangesEnabled = false, and recreating and disposing the context after each batch). I rejected that solution and used BulkInsert, that's

Dapper - Bulk insert of new items and get back new IDs

这一生的挚爱 posted on 2019-12-12 09:57:18

Question: I am using Dapper to add multiple new students in one DB hit using this method: db.ExecuteAsync(@"INSERT Student(Name,Age) values (@Name,@Age)", students.Select(s => new { Name = s.Name, Age = s.Age }) ); But the problem is that I don't have the new IDs. Can I make one DB hit and still somehow get the new IDs? And if not, what is the most efficient way of performing such a bulk insert? Answer 1: That is not a bulk insert; it is basically just shorthand that unrolls the loop; although interestingly
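One single-round-trip alternative on SQL Server is a multi-row INSERT with an OUTPUT clause, which returns every generated identity at once. A sketch assuming the Student table from the question (values are placeholders):

```sql
-- Sketch: OUTPUT INSERTED.Id captures all generated identities
-- from a single INSERT statement.
DECLARE @NewIds TABLE (Id int);

INSERT INTO Student (Name, Age)
OUTPUT INSERTED.Id INTO @NewIds
VALUES ('Dave Smith', 21), ('Sally Smith', 22);

SELECT Id FROM @NewIds;  -- one new identity per inserted row
```

From Dapper the same statement can be run with Query<int> to read the IDs back, though the multi-object ExecuteAsync shorthand in the question cannot return them.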

Delete All / Bulk Insert

十年热恋 posted on 2019-12-12 08:48:18

Question: First off, let me say I am running SQL Server 2005, so I don't have access to MERGE. I have a table with ~150k rows that I update daily from a text file. As rows fall out of the text file I need to delete them from the database, and if they change or are new I need to update/insert accordingly. After some testing I've found that, performance-wise, it is exponentially faster to do a full delete and then bulk insert from the text file rather than read through the file line by line doing an
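The delete-then-reload approach described above is usually wrapped in a single transaction so readers never observe an empty table. A sketch (table and file names are placeholders):

```sql
-- Sketch: swap the table contents atomically.
BEGIN TRANSACTION;

DELETE FROM dbo.DailyData;  -- TRUNCATE TABLE is faster but needs ALTER rights
                            -- and cannot be used if the table is replicated

BULK INSERT dbo.DailyData
FROM 'C:\import\daily.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

COMMIT TRANSACTION;
```

The TABLOCK hint lets the bulk load take a table-level lock, which on SQL Server 2005 is also one of the prerequisites for minimally logged inserts.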