bulkinsert

How to read millions of rows from a text file and insert them into a table quickly

夙愿已清 submitted on 2019-12-20 00:50:12
Question: I have gone through the "Insert 2 million rows into SQL Server quickly" link and found that I can do this by using BULK INSERT. So I am trying to create the DataTable (code as below), but as this is a huge file (more than 300K rows) I am getting an OutOfMemoryException in my code:

string line;
DataTable data = new DataTable();
string[] columns = null;
bool isInserted = false;

using (TextReader tr = new StreamReader(_fileName, Encoding.Default))
{
    if (columns == null)
    {
        line = tr.ReadLine();
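Rather than materializing every row in a DataTable, one common fix is to let SQL Server read the file directly with BULK INSERT, so nothing has to be held in application memory. A minimal sketch, assuming a hypothetical target table dbo.TargetTable whose columns already match the file and a tab-delimited layout (adjust the path and terminators to the real format):

BULK INSERT dbo.TargetTable
FROM 'C:\data\bigfile.txt'            -- hypothetical path
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 50000,          -- commit in batches to keep memory and the transaction log bounded
    TABLOCK
);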

Minimally Logged Insert Into

匆匆过客 submitted on 2019-12-19 19:49:00
Question: I have an INSERT statement that is eating a hell of a lot of log space, so much so that the hard drive is actually filling up before the statement completes. The thing is, I really don't need this to be logged as it is only an intermediate data upload step. For argument's sake, let's say I have:

Table A: initial upload table (populated using bcp, so no logging problems)
Table B: populated using INSERT INTO B from A

Is there a way that I can copy between A and B without anything being written
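A sketch of the usual minimal-logging recipe, with placeholder database names; minimal logging generally requires a SIMPLE or BULK_LOGGED recovery model plus a TABLOCK hint on the target (roughly: any heap with TABLOCK, or an empty indexed table), so treat this as an outline rather than a guarantee:

ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;   -- or SIMPLE; placeholder database name

INSERT INTO B WITH (TABLOCK)
SELECT *
FROM A;

ALTER DATABASE MyDb SET RECOVERY FULL;          -- switch back and take a log backup afterwards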

Bulk Insert Multiple XML files with SSIS 2008

佐手、 submitted on 2019-12-19 12:23:38
Question: I have a folder with multiple XML files. I need to bulk insert each one into a table in SQL Server. I am at a complete loss as to how to get this to work, as I am new to SSIS. Currently, my SSIS package pulls the files off an FTP server and uses a command line to unzip the XML (they come as .xml.gz). This all works great, but now I'm at a loss as to how to get the files into the database, as the Bulk Insert task only takes delimited files. Suggestions?

Answer 1: You can accomplish this by using a ForEach
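One pattern that fits the tools named in the answer: put an Execute SQL Task inside the ForEach Loop container and have it load the current file through OPENROWSET(BULK ..., SINGLE_BLOB), which handles whole XML documents rather than delimited rows. A rough sketch of the per-file statement, where the file path would be substituted from the loop's variable (for example via an expression on the task) and dbo.XmlStaging(XmlData XML) is a hypothetical staging table:

INSERT INTO dbo.XmlStaging (XmlData)
SELECT CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK 'C:\xml\current_file.xml', SINGLE_BLOB) AS x;   -- path swapped in on each iteration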

How can I specify the path to a file dynamically in OPENROWSET(BULK…)?

↘锁芯ラ submitted on 2019-12-19 08:54:18
Question: I want to insert images into an Image field, preferably using a stored procedure which will accept a path to an image. After hacking around I came up with this:

-- functional
DECLARE @parameters nvarchar(max) = '';
DECLARE @sql_string nvarchar(max) = N'UPDATE MyTable
    SET MyImageField = (SELECT BulkColumn
                        FROM Openrowset(Bulk ''' + @PathToMyImage + ''', Single_Blob) ImageData)
    WHERE MyPrimaryKey = ' + CAST(@PrimaryKey AS NVARCHAR(max));
EXECUTE sp_executesql @sql_string, @parameters

I did this
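A hedged rework of the same statement: OPENROWSET(BULK ...) only accepts a literal path, so that part still has to be concatenated, but the key can be passed as a real sp_executesql parameter instead of being cast into the string (assuming here that MyPrimaryKey is an int):

DECLARE @sql nvarchar(max) = N'
    UPDATE MyTable
    SET MyImageField = (SELECT BulkColumn
                        FROM OPENROWSET(BULK ''' + @PathToMyImage + N''', SINGLE_BLOB) AS ImageData)
    WHERE MyPrimaryKey = @pk;';

EXECUTE sp_executesql @sql, N'@pk int', @pk = @PrimaryKey;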

Bulk insert rowterminator issue

假装没事ソ submitted on 2019-12-18 21:07:10
Question: I have this CSV named test.csv with the content below:

1,"test user",,,4075619900,example@example.com,"Aldelo for Restaurants","this is my deal",,"location4"
2,"joe johnson",,"32 bit",445555519,antle@gmail.com,"Restaurant Pro Express","smoe one is watching u",,"some location"

Here is my SQL file to do the BULK INSERT:

USE somedb
GO
CREATE TABLE CSVTemp (id INT, name VARCHAR(255), department VARCHAR(255), architecture VARCHAR(255), phone VARCHAR(255), email VARCHAR(255), download VARCHAR(255),
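Two things commonly trip up a file like this: the row terminator (a file produced on Unix or by many export tools ends lines with a bare LF, which has to be written as 0x0a) and the quoted fields, which plain BULK INSERT does not strip before SQL Server 2017's FORMAT = 'CSV'. A sketch against the table above, with a placeholder path:

BULK INSERT CSVTemp
FROM 'C:\data\test.csv'               -- placeholder path
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '0x0a',         -- LF-only line endings; use '\r\n' if the file is CRLF
    FORMAT          = 'CSV'           -- SQL Server 2017+; strips the surrounding double quotes
);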

Can SQL Server BULK INSERT read from a named pipe/fifo?

瘦欲@ submitted on 2019-12-18 16:45:20
Question: Is it possible for BULK INSERT/bcp to read from a named pipe, fifo-style? That is, rather than reading from a real text file, can BULK INSERT/bcp be made to read from a named pipe which is on the write end of another process? For example:

create a named pipe
unzip a file to the named pipe
read from the named pipe with bcp or BULK INSERT

or:

create 4 named pipes
split 1 file into 4 streams, writing each stream to a separate named pipe
read from the 4 named pipes into 4 tables with bcp or BULK INSERT

The closest

How to bulk insert into MySQL using C#

穿精又带淫゛_ submitted on 2019-12-18 08:09:33
Question: I have previously used the SqlBulkCopy class to load data into an MS SQL Server db. The results were very good, and it worked exactly as I intended it to. Now I'm trying to use a script task in SSIS to bulk load data into a MySQL (5.5.8) database using either an ODBC or ADO.NET connection (recommendations?). The columns in my dataset correspond with the columns of the MySQL table. What is the best way to do a bulk insert of a dataset into a MySQL database?

Answer 1: You can use the MySqlBulkLoader shipped
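MySqlBulkLoader is essentially a wrapper that builds and runs a LOAD DATA INFILE statement for you, so the equivalent SQL looks roughly like the sketch below; the file path, table name and terminators are placeholders to adjust:

LOAD DATA LOCAL INFILE 'C:/data/export.csv'   -- placeholder path
INTO TABLE my_table                           -- placeholder table
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';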

MySQL LOAD DATA Error (Errcode: 2 - “No such file or directory”)

回眸只為那壹抹淺笑 submitted on 2019-12-18 06:54:47
Question: I am trying to load data into a table of my MySQL database, and I am getting this error:

LOAD DATA LOCAL INFILE 'C:\Users\Myself\Desktop\Blah Blah\LOAD DATA\week.txt' INTO TABLE week;

Reference: this

The path is one hundred percent correct; I copied it by pressing Shift and clicking "copy path as" and checked it many times. So any tips on this will be much appreciated.

My research: seeing this answer, I tried changing C:\Users to C:\\Users. It did not work for me. Secondly, is there a way to
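The usual cause of Errcode 2 here is not the path itself but the backslashes: inside a MySQL string literal, \ starts an escape sequence, so the path the server actually receives is mangled. Using forward slashes (or doubling every backslash, not just the first one) normally resolves it; a sketch with the same path:

LOAD DATA LOCAL INFILE 'C:/Users/Myself/Desktop/Blah Blah/LOAD DATA/week.txt'
INTO TABLE week;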

How to BULK INSERT a file into a *temporary* table where the filename is a variable?

和自甴很熟 submitted on 2019-12-18 01:30:13
Question: I have some code like this that I use to do a BULK INSERT of a data file into a table, where the data file and table name are variables:

DECLARE @sql AS NVARCHAR(1000)
SET @sql = 'BULK INSERT ' + @tableName + ' FROM ''' + @filename + ''' WITH (CODEPAGE=''ACP'', FIELDTERMINATOR=''|'')'
EXEC (@sql)

This works fine for standard tables, but now I need to do the same sort of thing to load data into a temporary table (for example, #MyTable). But when I try this, I get the error: Invalid Object Name
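One way this is commonly handled: create the temporary table in the outer scope before running the dynamic SQL. A temp table created by the caller is visible inside EXEC, so the dynamic BULK INSERT can target it by name; the column list below is hypothetical:

CREATE TABLE #MyTable (Col1 varchar(50), Col2 varchar(50));   -- hypothetical columns

DECLARE @sql AS NVARCHAR(1000)
SET @sql = 'BULK INSERT #MyTable FROM ''' + @filename + ''' WITH (CODEPAGE=''ACP'', FIELDTERMINATOR=''|'')'
EXEC (@sql)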

How to Bulk Insert from XLSX file extension?

谁说我不能喝 submitted on 2019-12-17 16:59:13
Question: Can anyone advise how to bulk insert from an .xlsx file? I tried the below query already:

BULK INSERT #EVB
FROM 'C:\Users\summer\Desktop\Sample\premise.xlsx'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 2);
SELECT * FROM #EVB

I also tried with FIELDTERMINATOR values like '\t', ',', ';' and '|', but this doesn't work either. Unfortunately, there is no error message.

Answer 1: You can save the xlsx file as a tab-delimited text file and do BULK INSERT TableName FROM 'C:
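An alternative that avoids converting the workbook is to query it directly through the ACE OLE DB provider with OPENROWSET; this assumes the provider is installed on the server, ad hoc distributed queries are enabled, and the worksheet is named Sheet1 (a placeholder):

SELECT *
INTO #EVB
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Users\summer\Desktop\Sample\premise.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');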