bulkinsert

Can MySqlBulkLoader be used with a transaction?

末鹿安然 submitted on 2019-12-30 09:00:22
Question: Can MySqlBulkLoader be used with a transaction? I don't see a way to explicitly attach a transaction to an instance of the loader. Is there another way? Answer 1: As stated here by a member of the MySQL documentation team: it's not atomic. The records loaded prior to the error will be in the table. The workaround is to import the data into a dedicated table and then execute INSERT INTO ... SELECT ..., which is an atomic operation. On huge data sets this is a potential problem because of the long transaction. Answer 2: The
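
The staging-table workaround from Answer 1 looks roughly like the sketch below, shown in Python with mysql-connector-python since the pattern is language-agnostic (the original question is about .NET); all table, column, and file names are illustrative assumptions.

```python
# Staging-table workaround: a partial LOAD DATA only dirties the staging
# table; the copy into the real table is a single atomic statement.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app", password="secret",
                               database="mydb", allow_local_infile=True)
try:
    cur = conn.cursor()
    # LOAD DATA is what MySqlBulkLoader issues under the hood.
    cur.execute(
        "LOAD DATA LOCAL INFILE '/tmp/rows.csv' INTO TABLE staging_orders "
        "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'"
    )
    # Atomic copy into the destination table.
    cur.execute("INSERT INTO orders (id, amount) SELECT id, amount FROM staging_orders")
    conn.commit()
except Exception:
    conn.rollback()
    raise
finally:
    conn.close()
```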

PostgreSQL: insert multiple rows with ids returned from select queries

社会主义新天地 submitted on 2019-12-25 04:06:28
Question: I have a complex query that joins multiple tables and returns many member ids (line 5). For each memberId I want to insert a memberSegment record, consisting of the memberId (new for each insert) and a segmentId (always the same, not a list): INSERT INTO db."memberSegment" ("memberId", "segmentId") VALUES ( (select table."memberId" complex query returns many ids ), (SELECT id FROM db.segment where "idName" = 'due-for-360') ); From reading on SO this is how I understood it should look, but I am
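
The shape the question is reaching for is an INSERT ... SELECT rather than VALUES with per-row subqueries. A minimal sketch via psycopg2; the stand-in subquery for the "complex query" and the connection details are assumptions:

```python
# INSERT ... SELECT inserts one memberSegment row per memberId that the
# inner query returns; the constant segmentId is joined in once.
import psycopg2

SQL = """
INSERT INTO db."memberSegment" ("memberId", "segmentId")
SELECT m."memberId", s.id
FROM (SELECT "memberId" FROM db.member) AS m        -- stand-in for the complex query
CROSS JOIN (SELECT id FROM db.segment WHERE "idName" = 'due-for-360') AS s;
"""

with psycopg2.connect("dbname=mydb user=app") as conn:  # commits on clean exit
    with conn.cursor() as cur:
        cur.execute(SQL)
```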

Getting PKs of updated records in CakePHP/SQL Server

落花浮王杯 submitted on 2019-12-25 01:55:58
Question: I have used a bulk update to sync data between two tables, as below: $sqlProc = "UPDATE cards SET cards.card_no = t2.card_number, cards.expiry_date = t2.expiry_date OUTPUT INSERTED.Id AS 'updated_id' FROM cards INNER JOIN card_temp t2 ON (cards.account_no = t2.account_number AND cards.customer_name = t2.customer_name) WHERE cards.is_disabled = 'N'"; debug($this->Request->query($sqlProc)); The above query also returns the primary key of each updated record via OUTPUT INSERTED.Id AS 'updated_id' in SQL Server
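
Rows produced by OUTPUT INSERTED come back like an ordinary result set, so they can be fetched directly after the UPDATE. A sketch with pyodbc rather than CakePHP (driver name and connection details are assumptions):

```python
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=localhost;DATABASE=mydb;Trusted_Connection=yes")
cur = conn.cursor()
cur.execute("""
    UPDATE cards
    SET cards.card_no = t2.card_number,
        cards.expiry_date = t2.expiry_date
    OUTPUT INSERTED.Id AS updated_id
    FROM cards
    INNER JOIN card_temp t2
        ON cards.account_no = t2.account_number
       AND cards.customer_name = t2.customer_name
    WHERE cards.is_disabled = 'N'
""")
updated_ids = [row.updated_id for row in cur.fetchall()]  # OUTPUT rows read like a SELECT
conn.commit()
```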

MySqlBulkLoader with Column Mapping?

假装没事ソ submitted on 2019-12-25 01:47:09
Question: I use SqlBulkCopy to do bulk inserts into a SQL Server database. I am now adding MySQL support to my program, and the nearest thing to SqlBulkCopy is MySqlBulkLoader. But with MySqlBulkLoader I first have to convert my DataTable to a file, because MySqlBulkLoader only works with files, not DataTables. Then I have to disable foreign key checks before the insert. I have done both, but now I am left with one more problem: my destination table has an identity column (auto-increment
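
MySqlBulkLoader exposes a Columns collection for this kind of mapping; in raw SQL terms it becomes a column list on the LOAD DATA statement, so the auto-increment column is simply omitted. A sketch of that statement, driven from Python here (all names illustrative):

```python
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app", password="secret",
                               database="mydb", allow_local_infile=True)
cur = conn.cursor()
cur.execute("SET FOREIGN_KEY_CHECKS = 0")   # mirrors the FK workaround in the question
cur.execute(
    "LOAD DATA LOCAL INFILE '/tmp/rows.csv' INTO TABLE customers "
    "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' "
    "(name, email)"                         # file columns mapped by name; id is omitted
)
cur.execute("SET FOREIGN_KEY_CHECKS = 1")
conn.commit()
conn.close()
```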

Bulk insert: "Cannot bulk load. Unknown version of format file"

跟風遠走 submitted on 2019-12-24 22:19:49
Question: I'm having problems importing data exported using a more recent version of the bcp utility. I need my exported data to be compatible with SQL Server 2012. It is my understanding that I need to add the -V110 option when exporting data. However, even when I add -V110 to the bcp command, I always get version 12.0 (which is my bcp version) in the FMT file. Here is a sample of my bcp call: bcp [DB].[dbo].[Table1] format nul -c -f "E:\TMP\DATA\Table1.FMT" -V110 -S [SERVER] -t , -T Does anybody
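
One commonly suggested workaround is to patch the version number that bcp writes on the first line of a non-XML format file, since SQL Server 2012 corresponds to format version 11.0. A small sketch, assuming a plain (non-XML) .FMT file:

```python
from pathlib import Path

fmt = Path(r"E:\TMP\DATA\Table1.FMT")
lines = fmt.read_text().splitlines()
lines[0] = "11.0"    # first line of a non-XML format file holds its version
fmt.write_text("\n".join(lines) + "\n")
```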

Transaction not connected, or was disconnected error

ぃ、小莉子 submitted on 2019-12-24 16:27:31
Question: I am getting a "Transaction not connected, or was disconnected" error when the transaction is committed or rolled back after doing a bulk insert (along with some other operations). using(var tran = Session.Session().BeginTransaction(IsolationLevel.Serializable)) { // do something fullSession.Session().CreateSQLQuery(query).ExecuteUpdate(); // this query bulk inserts into a temp db // do something else tran.Commit()/tran.RollBack(); // if the transaction is active and not already rolled back
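
For contrast, the commit-or-roll-back-exactly-once shape the snippet is aiming for, written against plain DB-API with pyodbc (illustrative only; the original uses NHibernate sessions, and the temp table name is an assumption):

```python
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=localhost;DATABASE=mydb;Trusted_Connection=yes",
                      autocommit=False)
cur = conn.cursor()
try:
    cur.execute("SELECT * INTO #staging FROM source_rows")  # bulk copy into a temp table
    conn.commit()      # finish the transaction exactly once...
except Exception:
    conn.rollback()    # ...on either the commit or the rollback path
    raise
finally:
    conn.close()
```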

INSERT IGNORE or INSERT WHERE NOT IN

走远了吗. submitted on 2019-12-24 12:22:02
Question: I have a 9-million-row table and I'm struggling to handle all this data because of its sheer size. What I want to do is import a CSV into the table without overwriting existing data. Previously I would have done something like this: INSERT if not in(select email from tblName where source = "number" and email != "email") INTO (email...) VALUES ("email"...) But I'm worried that I'll crash the server again. I want to be able to insert tens of thousands of rows into a table, but only if they are not already in the table with source
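
With a UNIQUE key covering (source, email), INSERT IGNORE lets MySQL skip duplicates server-side instead of running a NOT IN subquery per batch. A sketch with mysql-connector-python (table and column names follow the question; the CSV layout is an assumption):

```python
import csv
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="app", password="secret",
                               database="mydb")
cur = conn.cursor()
# One-time setup so IGNORE has a duplicate rule to act on:
#   ALTER TABLE tblName ADD UNIQUE KEY uq_source_email (source, email);
with open("import.csv", newline="") as f:
    rows = [(r["source"], r["email"]) for r in csv.DictReader(f)]
for i in range(0, len(rows), 10_000):        # insert in modest chunks
    cur.executemany("INSERT IGNORE INTO tblName (source, email) VALUES (%s, %s)",
                    rows[i:i + 10_000])
    conn.commit()
conn.close()
```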

How to improve the speed of thousands of inserts in a Nested Set model?

拈花ヽ惹草 submitted on 2019-12-24 05:37:11
Question: I'm using MySQL. I want to insert 100,000 places that belong to a Nested Set model with different levels (root_id, lft, rgt, level). If I didn't have to worry about the tree, I would do a bulk insert; but in this case, I have to run 3 queries per object, in this order: foreach ( $places as $place ) { # get parent place from $ancestor_array ( woeid => ids ) $parent = $ancestors[:parent_woeid] update place set lft = lft + 2 where root_id = :root_id and lft >= :obj_lft; update place set rgt =
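
One common speed-up is to stop maintaining lft/rgt inside the database during the load: number the whole tree in memory with a single depth-first pass, then bulk-insert the finished rows. A minimal sketch of the numbering step (the children mapping is assumed to be built from the input data):

```python
def number_tree(children, root):
    """Assign (lft, rgt, level) to every node with one depth-first pass."""
    out, counter = {}, [1]

    def visit(node, level):
        lft = counter[0]
        counter[0] += 1
        for child in children.get(node, []):
            visit(child, level + 1)
        out[node] = (lft, counter[0], level)
        counter[0] += 1

    visit(root, 0)
    return out

# Example: root 1 with children 2 and 3; node 2 has child 4.
nodes = number_tree({1: [2, 3], 2: [4]}, root=1)
# {4: (3, 4, 2), 2: (2, 5, 1), 3: (6, 7, 1), 1: (1, 8, 0)} -> one multi-row INSERT
```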

PyMongo’s bulk write operation features with generators

微笑、不失礼 submitted on 2019-12-24 03:27:11
Question: I would like to use PyMongo’s bulk write operation features, which execute write operations in batches to reduce the number of network round trips and increase write throughput. I also found here that it was possible to use 5000 as a batch size. However, I do not know what the best batch size is, nor how to combine PyMongo’s bulk write operation features with generators in the following code: from pymongo import MongoClient from itertools import groupby import csv def iter
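
One way to marry bulk_write with a generator is to slice it into fixed-size chunks with itertools.islice; the batch size of 1000 below is a placeholder, not a recommendation from the post:

```python
from itertools import islice
from pymongo import MongoClient, InsertOne

def batches(iterable, size):
    """Yield lists of up to `size` items from any iterable, lazily."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

client = MongoClient()
coll = client.testdb.items

docs = ({"n": i} for i in range(10_000))     # any generator of documents
for chunk in batches(docs, 1000):
    coll.bulk_write([InsertOne(d) for d in chunk], ordered=False)
```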