bulkinsert

Upload CSV Files Using the SQL Server OpenRowSet Function

Submitted by 百般思念 on 2020-01-13 06:48:37
Question: I need to upload multiple files (file1, file2, file3, ...) into tables (Table1, Table2, Table3, ...) in a SQL Server DB using the OpenRowset function. All the files are kept in C:\download. I use the following query, which works fine:

INSERT INTO dbo.Table1
SELECT * FROM OpenRowset('MSDASQL',
    'Driver={Microsoft Text Driver (*.txt;*.csv)};DefaultDir=C:\download;',
    'select * from File1.csv')

The question is how to pass the file name and table name as parameters. Thanks Tony for your answer. I have put the
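
One way to parameterize this (a hedged sketch, not from the original answer): OPENROWSET only accepts literal strings, so the file name and table name have to be spliced into a dynamic SQL string and executed with sp_executesql. The variable names below are hypothetical.

DECLARE @FileName  nvarchar(260) = N'File1.csv';   -- hypothetical parameter
DECLARE @TableName sysname       = N'dbo.Table1';  -- hypothetical parameter
DECLARE @sql       nvarchar(max);

-- Build the INSERT ... SELECT dynamically, since OPENROWSET cannot take variables
SET @sql =
    N'INSERT INTO ' + @TableName +
    N' SELECT * FROM OpenRowset(''MSDASQL'',' +
    N'''Driver={Microsoft Text Driver (*.txt;*.csv)};DefaultDir=C:\download;'',' +
    N'''select * from ' + @FileName + N''')';

EXEC sys.sp_executesql @sql;

Wrapping this in a stored procedure or a loop over (file, table) pairs then lets the same statement load File1 into Table1, File2 into Table2, and so on.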

Multiple column COPY format for PostgreSQL in Node.js

Submitted by 你离开我真会死。 on 2020-01-11 12:59:32
Question: I am using a Postgres stream to insert records into Postgres. It works fine for a single column, but what is the ideal data format for COPY with multiple columns? Code snippets:

var sqlcopysyntax = 'COPY srt (starttime, endtime) FROM STDIN delimiters E\'\\t\'';
var stream = client.query(copyFrom(sqlcopysyntax));
console.log(sqlcopysyntax)
var interndataset = [
    ['1', '4'],
    ['6', '12.074'],
    ['13.138', '16.183'],
    ['17.226', '21.605'],
    ['22.606', '24.733'],
    ['24.816', '27.027'],
    ['31.657', '33.617'],
    ['34.66',
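
For reference on the format itself (a sketch, not from the original answer): COPY ... FROM STDIN with a tab delimiter expects one row per line, column values separated by a single tab character (shown below as whitespace), each row ended by a newline, so from Node the stream should be fed strings of the form '1\t4\n'. The equivalent load typed into psql, using the table and values from the question, looks like this (\. is psql's end-of-data marker):

COPY srt (starttime, endtime) FROM STDIN WITH (DELIMITER E'\t');
1	4
6	12.074
13.138	16.183
\.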

bulk updating a list of values from a list of ids

Submitted by 好久不见. on 2020-01-11 06:07:30
Question: As an Oracle user playing around with MySQL, I frequently run into this issue. Consider the following situation: a list of ids (1, 2, 3, ..., n) and a list of values ('val1', 'val2', 'val3', ..., 'valn') [the actual values are obviously different from these]. The two lists are passed in order, meaning the value passed first corresponds to the id passed first. The objective is to update every value in the table with the corresponding id: val1 should update id 1, val2 should update id
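
A hedged sketch of one common MySQL approach (the table name value_table and the column names id and value are placeholders): build a derived table from the two lists and drive a single multi-table UPDATE from it, so each id picks up its matching value in one statement.

UPDATE value_table AS v
JOIN (
          SELECT 1 AS id, 'val1' AS new_value
UNION ALL SELECT 2,        'val2'
UNION ALL SELECT 3,        'val3'
) AS pairs ON pairs.id = v.id
SET v.value = pairs.new_value;

If id is the primary key (or has a unique index), a multi-row INSERT ... ON DUPLICATE KEY UPDATE over the same (id, value) pairs achieves the same result in one statement.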

Mongo Bulk Insert across multiple collections

Submitted by 给你一囗甜甜゛ on 2020-01-11 04:33:10
Question: I see that Mongo has bulk insert, but nowhere do I see the capability to do bulk inserts across multiple collections. Since I do not see it anywhere, I'm assuming it's not available in Mongo. Is there any specific reason for that? Answer 1: You are correct in that the bulk API operates on single collections only. There is no specific reason, but the APIs in general are collection-scoped, so a "cross-collection bulk insert" would be a design deviation. You can of course set up multiple bulk API objects in a

Bulk insert of hundreds of millions of records

Submitted by 元气小坏坏 on 2020-01-10 11:41:04
Question: What is the fastest way to insert 237 million records into a table that has rules (for distributing data across child tables)? I have tried or considered: INSERT statements; transactional inserts (BEGIN and COMMIT); the COPY FROM command; and http://pgbulkload.projects.postgresql.org/. Plain inserts are too slow (four days) and COPY FROM ignores rules (and has other issues). Example data:

station_id,taken,amount,category_id,flag
1,'1984-07-1',0,4,
1,'1984-07-2',0,4,
1,'1984-07-3',0,4,
1,'1984-07-4',0
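
Since the question already notes that COPY bypasses the rules on the parent table, one workaround is to aim COPY at the child tables directly: a hedged sketch, assuming the input can be pre-split by category_id and that the child tables follow a naming scheme such as the hypothetical measurement_category_4 below.

-- column list matches the sample data; one COPY per pre-split file/child table
COPY measurement_category_4 (station_id, taken, amount, category_id, flag)
FROM '/tmp/category_4.csv' WITH (FORMAT csv);

Loading each child directly keeps the speed of COPY while still landing every row in the correct partition, at the cost of splitting the input file up front.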

SQL Server bulk insert of CSV data containing commas

Submitted by ぃ、小莉子 on 2020-01-10 01:17:07
Question: Below is a sample line of the CSV:

012,12/11/2013,"<555523051548>KRISHNA KUMAR ASHOKU,AR",<10-12-2013>,555523051548,12/11/2013,"13,012.55",

You can see that "KRISHNA KUMAR ASHOKU,AR" should be a single field, but it is treated as two different fields (KRISHNA KUMAR ASHOKU and AR) because of the comma, even though they are enclosed in double quotes. Still no luck. I tried:

BULK INSERT tbl
FROM 'd:\1.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2
)
GO

Is there any solution for it? Answer 1: The answer is: you can't
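
On SQL Server 2017 and later, BULK INSERT can parse quoted CSV fields natively; a hedged sketch reusing the options from the question:

BULK INSERT tbl
FROM 'd:\1.csv'
WITH (
    FORMAT          = 'CSV',   -- treat the file as quoted CSV (SQL Server 2017+)
    FIELDQUOTE      = '"',     -- keeps "KRISHNA KUMAR ASHOKU,AR" as one field
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2
)
GO

On older versions the usual fallbacks are a format file or pre-processing the file to use a delimiter that never appears inside the data.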

MySQL Insert 20K rows in single insert

Submitted by 我的梦境 on 2020-01-09 11:11:40
Question: In my table I insert around 20,000 rows on each load. Right now I am doing it one by one. From the MySQL website I learned that inserting multiple rows with a single INSERT query is faster. Can I insert all 20,000 in a single query? What will happen if there are errors within these 20,000 rows? How will MySQL handle that? Answer 1: If you are inserting the rows from some other table then you can use the INSERT ... SELECT pattern to insert the rows. However, if you are inserting the values using INSERT ...
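
A hedged sketch of the multi-row form (table and column names are placeholders); in practice the 20,000 rows are usually sent in batches of a few hundred to a few thousand so each statement stays well under max_allowed_packet:

INSERT INTO my_table (col1, col2)
VALUES
    ('a1', 'b1'),
    ('a2', 'b2'),
    ('a3', 'b3');   -- ...and so on, up to the batch size

As for errors: if any row in a multi-row INSERT fails, the whole statement fails (and on a transactional engine such as InnoDB the statement is rolled back, so none of its rows are kept); INSERT IGNORE or INSERT ... ON DUPLICATE KEY UPDATE can be used to skip or repair offending rows instead.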

LOAD DATA INFILE is not allowed in MariaDB

Submitted by 你离开我真会死。 on 2020-01-06 06:41:15
Question: I created a PHP script that imports posts from a CSV file into a WordPress website. To do this, I first bulk import the posts into a table of the WP website database, and then the PHP script creates the posts. The bulk insert MySQL query I use is the following:

load data local infile '/var/www/vhosts/sitenamehere.test/test.csv'
into table test_table
character set latin1
fields terminated by ';'
lines terminated by '\r\n'
ignore 1 lines;

When I run the script from the server I get the following
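
The title points at LOCAL loading being disabled. A hedged sketch of the usual server-side checks (requires appropriate privileges); note that the PHP/mysqli client must also permit local infile, for example via mysqli.allow_local_infile in php.ini or the MYSQLI_OPT_LOCAL_INFILE option:

-- check whether LOCAL loading is currently enabled on the MariaDB server
SHOW VARIABLES LIKE 'local_infile';

-- enable it for the running server (it can also be set permanently in my.cnf)
SET GLOBAL local_infile = 1;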