bulkinsert

How to UPDATE table from csv file?

余生长醉 submitted on 2020-01-05 10:33:19
Question: How do I update a table from a CSV file in PostgreSQL (version 9.2.4)? The COPY command only inserts, but I need to update an existing table. How can I update the table from a CSV file without a temp table? I don't want to copy the CSV into a temp table and then update the table from it. And is there no MERGE command, as in Oracle?

Answer 1: The simple and fast way is with a temporary staging table, as detailed in this closely related answer: How to update selected rows with values from a CSV file in Postgres? If you don't "want" …
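The staging-table approach the answer points at can be sketched by generating the SQL it needs. This is a minimal illustration, not the linked answer's exact code; the table name `accounts` and columns `id`, `balance`, `status` are assumptions.

```python
def build_update_sql(table, key, cols):
    """Generate COPY-into-staging plus UPDATE ... FROM statements.

    Hypothetical helper: table/column names are placeholders, and the
    COPY here expects the CSV on STDIN (e.g. via psql's \copy).
    """
    collist = ", ".join([key] + cols)
    setlist = ", ".join(f"{c} = s.{c}" for c in cols)
    return (
        f"CREATE TEMP TABLE staging (LIKE {table} INCLUDING DEFAULTS);\n"
        f"COPY staging ({collist}) FROM STDIN WITH (FORMAT csv, HEADER true);\n"
        f"UPDATE {table} t SET {setlist} FROM staging s WHERE t.{key} = s.{key};\n"
        f"DROP TABLE staging;"
    )

sql = build_update_sql("accounts", "id", ["balance", "status"])
print(sql)
```

The temp table disappears at session end anyway; the explicit DROP just keeps repeated runs in one session clean.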

How do you increase the number of processes in parallel with Powershell 3?

佐手、 submitted on 2020-01-03 18:42:13
Question: I am trying to run 20 processes in parallel. I changed the session configuration as below, but with no luck; I still get at most 5 parallel processes per session.

$wo = New-PSWorkflowExecutionOption -MaxSessionsPerWorkflow 50 -MaxDisconnectedSessions 200 -MaxSessionsPerRemoteNode 50 -MaxActivityProcesses 50
Register-PSSessionConfiguration -Name ITWorkflows -SessionTypeOption $wo -Force
Get-PSSessionConfiguration ITWorkflows | Format-List -Property *

Is there a switch parameter to increase the number of …
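The symptom (a ceiling of 5 regardless of session options) is typically a throttle limit on the parallel loop itself rather than on the session. As a language-agnostic sketch of the same idea, the snippet below shows that the observed parallelism is capped by the worker-pool setting, not by how many tasks are submitted:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

peak = 0       # highest number of tasks ever observed running at once
running = 0
lock = threading.Lock()

def task(_):
    global peak, running
    with lock:
        running += 1
        peak = max(peak, running)
    time.sleep(0.05)  # simulate work
    with lock:
        running -= 1

# The pool size is the throttle: 100 tasks, at most 20 in flight.
with ThreadPoolExecutor(max_workers=20) as pool:
    list(pool.map(task, range(100)))

print(peak)
```

In PowerShell workflow terms the analogous knob is the loop/invocation throttle (e.g. a ThrottleLimit setting), which defaults to a small value independently of the session configuration; the exact parameter to use depends on how the workflow is invoked.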

Django `bulk_create` with related objects

邮差的信 submitted on 2020-01-03 16:47:50
Question: I have a Django system that runs billing for thousands of customers on a regular basis. Here are my models:

class Invoice(models.Model):
    balance = models.DecimalField(max_digits=6, decimal_places=2)

class Transaction(models.Model):
    amount = models.DecimalField(max_digits=6, decimal_places=2)
    invoice = models.ForeignKey(
        Invoice,
        on_delete=models.CASCADE,
        related_name='invoices',
        null=False,
    )

When billing is run, thousands of invoices with tens of transactions each are created using …
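The usual answer to this class of problem is a two-pass pattern: bulk-create the parent rows first so they get primary keys (on PostgreSQL, `bulk_create` sets pks on the returned objects), then point each child at its saved parent and bulk-create the children. Below is a plain-Python simulation of that pattern, with no Django dependency; `FakeDb` merely stands in for the key-assigning behavior of the database.

```python
class FakeDb:
    """Stands in for the database: assigns pks on 'bulk_create'."""
    def __init__(self):
        self.next_pk = 1
        self.rows = []

    def bulk_create(self, objs):
        for o in objs:
            o["pk"] = self.next_pk
            self.next_pk += 1
        self.rows.extend(objs)
        return objs

db = FakeDb()

# Pass 1: parents are created in one batch and receive their keys.
invoices = [{"balance": 0, "pk": None} for _ in range(3)]
db.bulk_create(invoices)

# Pass 2: children reference the now-known parent keys, then go in one batch.
transactions = []
for inv in invoices:
    for amount in (10, 20):
        transactions.append({"amount": amount, "invoice_id": inv["pk"]})
db.bulk_create(transactions)

print(sorted({t["invoice_id"] for t in transactions}))  # → [1, 2, 3]
```

In real Django the same shape is `Invoice.objects.bulk_create(invoices)` followed by `Transaction.objects.bulk_create(transactions)` with `transaction.invoice_id` set from the created invoices; on backends that don't return pks from `bulk_create`, the parents have to be re-queried first.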

sql server bulk insert nulls into time column

蓝咒 submitted on 2020-01-03 09:23:42
Question: I'm having trouble inserting a null value with a bulk insert statement. Two columns are nullable and the Id is an identity column. The nullable int works out fine, but the time column doesn't. Here is the bulk statement:

BULK INSERT Circulation
FROM '.....file.cs'
WITH (
    FIRSTROW = 2,
    MAXERRORS = 0,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '',
    KEEPNULLS
)

Here is an extract of the csv:

ID, IDStopLine, IDException, Hour, PositionHour, Day
, 28, 8, 12:20, 52, 0
, 29, 163, , 1,

Meaning that I'm trying to insert nulls in …
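One hedged guess at the cause: BULK INSERT only maps a field to NULL (even with KEEPNULLS) when the field is actually present, and rows with fewer separators than the table has columns make the terminator matching fall apart. A small preprocessing pass that pads every row to the full column count is one way to rule that out; the column layout below mirrors the header in the question, and the short last row is an illustrative assumption.

```python
import csv
import io

COLS = 6  # ID, IDStopLine, IDException, Hour, PositionHour, Day

raw = (
    "ID,IDStopLine,IDException,Hour,PositionHour,Day\n"
    ",28,8,12:20,52,0\n"
    ",29,163,,1"  # hypothetical short row: trailing fields missing entirely
)

# Pad each row so empty (to-be-NULL) fields are present as empty strings.
fixed = []
for row in csv.reader(io.StringIO(raw)):
    fixed.append(row + [""] * (COLS - len(row)))

print([len(r) for r in fixed])  # → [6, 6, 6]
```

If the field counts are already consistent, the remaining suspect is the empty string itself, which SQL Server will not coerce to a NULL time; staging into a varchar column and converting with NULLIF is the common workaround there.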

BULK INSERT into specific columns?

时光总嘲笑我的痴心妄想 submitted on 2020-01-03 07:26:09
Question: I want to bulk insert the columns of a CSV file into specific columns of a destination table. The destination table has more columns than my CSV file, so I want the CSV columns to go to the right target columns using BULK INSERT. Is this possible? If yes, how do I do it? I saw the tutorial and code at http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/ and http://www.codeproject.com …
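It is possible; the two standard routes are a bcp format file that maps CSV fields to table columns, or a view that exposes only the CSV's columns so BULK INSERT targets the view and the remaining table columns take their defaults. A sketch of the view route, generating the SQL from Python (table name `Customers`, columns `Name`/`Email`, and the path are all assumptions):

```python
def view_based_bulk_insert(table, csv_cols, csv_path):
    """Build the CREATE VIEW + BULK INSERT pair for a column subset."""
    cols = ", ".join(csv_cols)
    return (
        f"CREATE VIEW v_{table}_load AS SELECT {cols} FROM {table};\n"
        f"BULK INSERT v_{table}_load FROM '{csv_path}'\n"
        "WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
    )

sql = view_based_bulk_insert("Customers", ["Name", "Email"], "C:\\data\\in.csv")
print(sql)
```

The view route only works if every column omitted from the view is nullable or has a default; otherwise a format file is the way to go.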

bulk insert txt error with ROWTERMINATOR

天涯浪子 submitted on 2020-01-03 02:39:06
Question: I have a txt file and have to load it into SQL with a bulk insert:

BULK INSERT table
FROM '\\01cends5\TestBulk\a.txt'
WITH (
    DATAFILETYPE = 'char',
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    FIRSTROW = 1,
    LASTROW = 15
)

But it does not accept the ROWTERMINATOR as the end of line. I have tried everything ({CR}{LF}, {LF}, {CR}, \n, \r, \r\n, \n\r) and nothing works. My txt format is:

0|20276708598|119302|201101|000000|000000

Answer 1: It looks like something is wrong with the '\r' translation to 0x0A, at least in my …
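A quirk worth knowing here: BULK INSERT treats ROWTERMINATOR = '\n' as CRLF, so for a file that ends its lines with a bare LF the hex form '0x0a' is often what actually works. A small hedged helper to check what the file really uses before guessing:

```python
def detect_row_terminator(data: bytes) -> str:
    """Report which BULK INSERT ROWTERMINATOR matches the file's bytes."""
    if b"\r\n" in data:
        return "'\\n'"   # BULK INSERT expands '\n' to CRLF
    if b"\n" in data:
        return "'0x0a'"  # bare LF: pass the hex byte instead
    if b"\r" in data:
        return "'\\r'"
    return "unknown"

# Sample row shaped like the question's data, assumed to end in a bare LF.
sample = b"0|20276708598|119302|201101|000000|000000\n"
print(detect_row_terminator(sample))  # → '0x0a'
```

In practice: open the file in a hex editor (or a check like the above), and if rows end in 0x0A alone, use ROWTERMINATOR = '0x0a'.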

Entity Framework BulkInsert not inserting child entities

不羁的心 submitted on 2020-01-01 19:53:11
Question: I have two tables, INS_Staging_UMLERTransaction (parent) and INS_Staging_UMLERBlueCard (child). I need to insert 1000 records, but when I use bulk insert only the parent table is populated. The following is my code:

_indnCon.BulkInsert(_DataToTrans);
_indnCon.BulkInsert(_DataToTrans.SelectMany(m => m.INS_Staging_UMLERBlueCard));
_indnCon.BulkSaveChanges();

Answer 1: There is no BulkInsert or BulkSaveChanges method in Entity Framework Extended. So in my answer, I will assume you are using Entity Framework …

Bulk insert from C# list into SQL Server into multiple tables with foreign key constraints

喜欢而已 submitted on 2020-01-01 19:44:30
Question: I am completely clueless with this problem; any help would be highly appreciated. I have two tables: one is the master data table (Table A), and the other (Table B) has a foreign key relationship to it, with multiple entries (to be specific, 18) for each entry in Table A. I receive the data in a list and wish to insert it into a SQL Server database. I am currently using the pattern below, but it takes 14 minutes to insert 100 rows into Table A and the corresponding 18*100 rows into Table B.

using …
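The 14-minute cost almost certainly comes from per-row round trips. The usual fix is two set-based operations: bulk-load Table A while capturing the generated keys, then bulk-load Table B with those keys filled in. The Python sketch below simulates that shape without a database (in C#, SqlBulkCopy or a table-valued parameter would play the role of `bulk_insert`, and OUTPUT INSERTED would return the identity values):

```python
statements = 0  # counts round trips to the "database"

def bulk_insert(rows):
    """Simulated set-based insert: one round trip per batch, not per row."""
    global statements
    statements += 1
    start = statements * 1000  # stand-in for identity values
    return list(range(start, start + len(rows)))

# Phase 1: load all Table A rows in one batch, capturing their keys.
parent_rows = [{"name": f"master-{i}"} for i in range(100)]
parent_ids = bulk_insert(parent_rows)

# Phase 2: build all Table B rows against the captured keys, load in one batch.
child_rows = [
    {"parent_id": pid, "seq": s} for pid in parent_ids for s in range(18)
]
bulk_insert(child_rows)

print(statements, len(child_rows))  # → 2 1800
```

Two round trips for 100 + 1800 rows, versus 1900 individual inserts; that ratio, not the row count, is where the minutes go.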