load-data-infile

load data infile, dealing with fields with comma

◇◆丶佛笑我妖孽 submitted on 2019-12-04 09:59:55
How do we deal with fields containing commas when using LOAD DATA INFILE? I have this query: $sql = "LOAD DATA LOCAL INFILE '{$file}' INTO TABLE sales_per_pgs FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES (@user_id, @account_code, @pg_code, @sales_value) SET user_id = @user_id, account_code = @account_code, product_group_code = @pg_code, sales_value = REPLACE(@sales_value, ',', ''), company_id = {$company_id}, year = {$year}, month = {$month}"; and a line from the CSV looks like this: 139, pg89898, op89890, 1,000,000.00 where 1,000,000.00 is a sales value. Currently, what is
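The usual answer is to quote the field in the CSV and tell LOAD DATA about the quoting with an ENCLOSED BY clause. A minimal sketch, assuming the file can be re-exported so the sales value is quoted (e.g. `139,pg89898,op89890,"1,000,000.00"`); table and column names follow the question:

```sql
-- OPTIONALLY ENCLOSED BY '"' lets quoted fields contain the delimiter.
LOAD DATA LOCAL INFILE '/path/to/sales.csv'
INTO TABLE sales_per_pgs
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@user_id, @account_code, @pg_code, @sales_value)
SET user_id            = @user_id,
    account_code       = @account_code,
    product_group_code = @pg_code,
    sales_value        = REPLACE(@sales_value, ',', '');
```

If the file cannot be regenerated with quoting, an unquoted `1,000,000.00` is ambiguous to any comma-delimited parser, and the file would need preprocessing before the load.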

Importing bulk CSV data in UTF-8 into MySQL

这一生的挚爱 submitted on 2019-12-04 03:48:22
I'm trying to import about 10K rows of UTF-8 encoded data into a new MySQL table. I can do so successfully with LOAD DATA INFILE via MySQL Workbench, but the UTF-8 characters get mangled. I've tested the database otherwise via PHP and it accepts and stores UTF-8 characters fine. The problem seems to be with LOAD DATA INFILE, and I've come across a few threads about this. Does anyone know a workaround, or possibly another similarly easy method to import CSV data? Thank you. RESOLVED: For others who see this and have the same problem, just add the character set as a parameter when running LOAD DATA
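The resolution mentioned above refers to the CHARACTER SET clause of the statement, which tells MySQL how to decode the file instead of falling back to the session/server default. A sketch, with the file path and table name assumed:

```sql
-- Declare the file's encoding explicitly so bytes are decoded as UTF-8.
LOAD DATA LOCAL INFILE '/path/to/data.csv'
INTO TABLE my_table
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
```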

MYSQL: Display Skipped records after LOAD DATA INFILE?

这一生的挚爱 submitted on 2019-12-04 00:10:42
In MySQL I've used LOAD DATA LOCAL INFILE, which works fine. At the end I get a message like: Records: 460377 Deleted: 0 Skipped: 145280 Warnings: 0 How can I view the line number of the records that were skipped? SHOW WARNINGS doesn't work: mysql> show warnings; Empty set (0.00 sec) If there were no warnings but some rows were skipped, it may mean that the primary key was duplicated for the skipped rows. The easiest way to find the duplicates is by opening the local file in Excel and performing a duplicate removal on the primary key column to see if there are any. You could create a temp
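The temp-table idea the excerpt cuts off on can be sketched as follows: load into a staging table without the unique key, then group by the key to see which values occur more than once (table and column names are assumptions):

```sql
-- Staging table shaped like the target, but without the PRIMARY KEY,
-- so no rows are skipped on load.
CREATE TEMPORARY TABLE staging LIKE target;
ALTER TABLE staging DROP PRIMARY KEY;

LOAD DATA LOCAL INFILE '/path/to/data.csv' INTO TABLE staging
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';

-- Any key value with a count above 1 explains a skipped record.
SELECT id, COUNT(*) AS dup_count
FROM staging
GROUP BY id
HAVING COUNT(*) > 1;
```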

Importing CSV data using PHP/MySQL - Mysqli syntax

霸气de小男生 submitted on 2019-12-03 20:46:23
AT THE BOTTOM OF THIS QUESTION IS THE FINAL CODE THAT FINALLY WORKED! Trying to implement this ( Importing CSV data using PHP/MySQL ). I must be almost there... note 1: my $sql came straight from copy/paste of phpMyAdmin (generate PHP code) and ran just fine in phpMyAdmin. note 2: If I comment out the line $sql="DELETE FROM dbase" the code runs just fine (and the table is cleaned). So if I know my SQL is right and my code can run other SQL, why does the code below not run?! I'm getting: Call to a member function execute() on a non-object - for the line $stmt->execute(); Full code: <?php $mysqli = new

What mysql settings affect the speed of LOAD DATA INFILE?

怎甘沉沦 submitted on 2019-12-03 15:23:41
Let me set up the situation. We are trying to insert a modestly high number of rows (roughly 10-20M a day) into a MyISAM table that is modestly wide:

+--------------+--------------+------+-----+---------+-------+
| Field        | Type         | Null | Key | Default | Extra |
+--------------+--------------+------+-----+---------+-------+
| blah1        | varchar(255) | NO   | PRI |         |       |
| blah2        | varchar(255) | NO   | PRI |         |       |
| blah3        | varchar(5)   | NO   | PRI |         |       |
| blah4        | varchar(5)   | NO   | PRI |         |       |
| blah5        | varchar(2)   | NO   | PRI |         |       |
| blah6        | varchar(2)   | NO   | PRI |         |       |
| blah7        | date         | NO   | PRI |         |       |
| blah8        |
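For bulk loads into MyISAM, the settings most often tuned are the bulk-insert and MyISAM sort buffers, together with deferring index maintenance. A sketch, with illustrative (not recommended) values and an assumed table name; note that DISABLE KEYS only defers non-unique indexes, so it would not help with the all-PRI schema above:

```sql
-- Larger buffers for the bulk-insert tree cache and for index rebuilds.
SET SESSION bulk_insert_buffer_size  = 256 * 1024 * 1024;
SET SESSION myisam_sort_buffer_size  = 256 * 1024 * 1024;

ALTER TABLE wide_table DISABLE KEYS;   -- defer non-unique index builds
LOAD DATA LOCAL INFILE '/path/to/day.csv' INTO TABLE wide_table
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
ALTER TABLE wide_table ENABLE KEYS;    -- rebuild those indexes in one pass
```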

How to see progress of .csv upload in MySQL

我们两清 submitted on 2019-12-03 11:57:41
I have a very large .csv file, and I'm loading it into MySQL with the LOAD DATA INFILE command. Because it takes so long, I'd like to see how far along the upload has progressed. I've tried two methods so far. First I simply ran a SELECT COUNT(*) command to see how many rows had been inserted while the upload was in progress, but that always returns a count of 0. Second, I tried SHOW PROCESSLIST and saw simply how long the query has been running. Sometimes the status says 'freeing data' or something to that effect. Does anyone know a good way to track the progress of a LOAD DATA INFILE command?
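One workaround sketch, for a MyISAM target table: SHOW TABLE STATUS reports an approximate row count without needing to acquire the lock that makes SELECT COUNT(*) unhelpful during the load. The table name is assumed, and the reported count may lag the actual load:

```sql
-- Repeat this from a second session and watch the Rows column grow.
SHOW TABLE STATUS LIKE 'big_table';
```

For InnoDB the Rows value is only an estimate, so this is most useful as a rough progress indicator rather than an exact count.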

Import Large CSV file into MySQL

徘徊边缘 submitted on 2019-12-03 09:13:58
I am trying to import a CSV file into a MySQL table, and I currently have a script that runs line by line because I need to hash an id combined with another id, as well as format the date into MySQL format. The CSV file has MORE columns than I am currently importing. Is it easier to just import all columns? I was reading about LOAD DATA INFILE (http://dev.mysql.com/doc/refman/5.1/en/load-data.html), but I am wondering how I can use this and hash the ids and format the date without doing row-by-row execution. My current script is taking way too long and causing site performance issues while
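Both per-row transformations can be done inside LOAD DATA itself via user variables and a SET clause, avoiding the PHP loop entirely. A sketch in which the column names, the MD5 hash, and the `%m/%d/%Y` date format are assumptions; `@dummy` discards a CSV column that isn't imported:

```sql
LOAD DATA LOCAL INFILE '/path/to/import.csv'
INTO TABLE target
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@id_a, @id_b, @created, @dummy, name)
SET hashed_id  = MD5(CONCAT(@id_a, @id_b)),          -- hash the two ids
    created_at = STR_TO_DATE(@created, '%m/%d/%Y');  -- reformat the date
```

Extra CSV columns can simply be routed to `@dummy` variables, so there is no need to import everything.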

Is it possible to use a LOAD DATA INFILE type command to UPDATE rows in the db?

我怕爱的太早我们不能终老 submitted on 2019-12-02 21:13:39
Pseudo table:

| primary_key | first_name | last_name | date_of_birth |
| 1           | John Smith |           | 07/04/1982    |

At the moment first_name contains a user's full name for many rows. The desired outcome is to split the data, so first_name contains "John" and last_name contains "Smith". I have a CSV file which contains the desired format of data:

| primary_key | first_name | last_name |
| 1           | John       | Smith     |

Is there a way of using the LOAD DATA INFILE command to process the CSV file to UPDATE all rows in this table using the primary_key - and not replace any other data in the row during the process (i.e.
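LOAD DATA itself can only INSERT or REPLACE whole rows, so a common pattern is to load the CSV into a staging table and then UPDATE the real table with a join. A sketch under the question's column names, with the target table name (`people`) and path assumed:

```sql
CREATE TEMPORARY TABLE staging (
  primary_key INT PRIMARY KEY,
  first_name  VARCHAR(255),
  last_name   VARCHAR(255)
);

LOAD DATA LOCAL INFILE '/path/to/names.csv' INTO TABLE staging
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;

-- Only the two name columns change; date_of_birth is untouched.
UPDATE people p
JOIN staging s ON s.primary_key = p.primary_key
SET p.first_name = s.first_name,
    p.last_name  = s.last_name;
```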

MYSQL LOAD DATA INFILE

喜你入骨 submitted on 2019-12-02 08:31:21
I use LOAD DATA INFILE in MySQL. In my input file I have an "x" character, but I have to save it to the database as NULL. How can I do it? One suggested form: load data infile... into table ... fields terminated by... lines terminated by ... ( my_field... ) set my_field = if(my_field = 'x', null, my_field); Alock Leo's answer reads the column into a user variable first, then converts it in the SET clause: load data infile... into table ... fields terminated by... lines terminated by ... ( @var... ) set my_field = if(@var = 'x', null, @var); Thanks to pekka. Source: https://stackoverflow.com/questions/4723522/mysql-load-data-infile

LOAD DATA INFILE: Invalid ut8mb4 character string

馋奶兔 submitted on 2019-12-02 08:10:55
I'm using LOAD DATA INFILE to import some big tables (iTunes EPF). However, the import fails with this error: string(52) "Invalid utf8mb4 character string: 'אל נא תלך'" The table is created like this: CREATE TABLE `song-tmp` ( `song_id` int(11) NOT NULL DEFAULT '0', `name` varchar(1000) DEFAULT NULL, `title_version` varchar(1000) DEFAULT NULL, `artist_display_name` varchar(1000) DEFAULT NULL, PRIMARY KEY (`song_id`) ) ENGINE=MyISAM DEFAULT CHARSET=utf8mb4; This is the import query I'm using:
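The excerpt cuts off before the query, but errors like this typically mean the statement is decoding the file with the wrong character set. A hedged sketch: declare CHARACTER SET explicitly, and note that EPF feeds are generally delimited with non-printing separators (chr(1) between fields, chr(2) plus newline between records) rather than commas; verify both assumptions against the actual feed:

```sql
LOAD DATA LOCAL INFILE '/path/to/song'
INTO TABLE `song-tmp`
CHARACTER SET utf8mb4          -- decode the file as UTF-8, not the session default
FIELDS TERMINATED BY x'01'     -- EPF field separator (chr(1)), as an assumption
LINES TERMINATED BY x'020A';   -- EPF record separator (chr(2) + '\n'), as an assumption
```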