I am using MySQL locally on my Mac. I have a CSV file with all fields enclosed in double quotes. Here is an extract of the first rows of my CSV file:
$ head -5 companies.csv
"COMPANY_ADDRESS1","COMPANY_ADDRESS2","COMPANY_CITY","COMPANY_COUNTRY","COMPANY_FAX","COMPANY_GERMANY","COMPANY_ID","COMPANY_INDIANAPOLIS","COMPANY_NAME","COMPANY_STATE","COMPANY_TELEPHONE","COMPANY_VAT","COMPANY_ZIP"
"980 Madison Avenue","6th Floor","New York","USA","","","1393","","Lucky Fives LLC","NY","212-74-2313","","10075"
"1209 Orange Street","","Wilmington","USA","","","1394","","Global Five Holdings","DE","-","","19801"
"2020 Ponce de Leon Boulevard","PH2","Coral Gables","USA","","1015110","1395","","Lion Aero Corporation","FL","305-755-5120","","33134"
"100 NE Adams Street","","Peoria","USA","","","1396","","Caterpillar Inc.","IL","+13096755975","","61630"
And here is my import command:
$ mysqlimport --local --ignore-lines=1 my_db companies.csv -u root --fields-enclosed-by='\"' --lines-terminated-by='\r\n' --fields-terminated-by=','
I am getting all records imported, nothing is skipped:
companies: Records: 3063 Deleted: 0 Skipped: 0 Warnings: 104
However, why am I getting all the data imported into the first column, with all the other columns populated with NULL?
The table was previously created with all columns necessary to support the data to be loaded.
Test schema:
-- drop table if exists aT;
create table aT
( -- address thing
COMPANY_ADDRESS1 varchar(100) not null,
COMPANY_ADDRESS2 varchar(100) not null,
COMPANY_CITY varchar(100) not null,
COMPANY_COUNTRY varchar(100) not null,
COMPANY_FAX varchar(100) not null,
COMPANY_GERMANY varchar(100) not null,
COMPANY_ID varchar(100) not null,
COMPANY_INDIANAPOLIS varchar(100) not null,
COMPANY_NAME varchar(100) not null,
COMPANY_STATE varchar(100) not null,
COMPANY_TELEPHONE varchar(100) not null,
COMPANY_VAT varchar(100) not null,
COMPANY_ZIP varchar(100) not null
);
Command run interactively or via a program:
LOAD DATA INFILE 'c:\\nate\\aT.csv' INTO TABLE aT
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
select * from aT;
Or on Linux:
LOAD DATA INFILE '/home/nate/aT.csv' INTO TABLE aT
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
No problem bringing the data in.
select * from aT; -- view the import
Now for the command line (create a .sql file from one of the above):
truncate aT; -- run this first
mysql -u root -p so_gibberish2 < c:\nate\run_this1.sql
so_gibberish2 is the database name. The data comes in fine, because the mysql client is not subject to bash or other command-line escape sequences.
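For reference, a minimal sketch of what run_this1.sql could contain, assuming you fold the truncate into the same file; it is just the statements from above saved together:

-- run_this1.sql (assumed contents; same statements as above)
truncate aT; -- clear the table before re-importing
LOAD DATA INFILE 'c:\\nate\\aT.csv' INTO TABLE aT
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;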
I think mysqlimport is looking for a backslash double quote to enclose fields, and it's not finding any, so it's all one field.
I'm questioning the purpose of the backslash character before the double quote. If this is for the Linux bash shell, I don't think the backslash is required to specify the double quote. As a demo:
# echo foo --fields-enclosed-by='\"'
foo --fields-enclosed-by=\"
# echo foo --fields-enclosed-by='"'
foo --fields-enclosed-by="
(This is just a guess, I could be wrong. I've been wrong plenty of times before. It's possible that mysqlimport is discarding that backslash character that's being passed to it. The issue might be something else. But for a test, I'd give it a whirl without the backslash, and see how big of a smokeball it makes without it.)
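For example, the same mysqlimport invocation with only the backslash dropped, everything else kept as in the question (untested on your data, so treat it as a sketch):

$ mysqlimport --local --ignore-lines=1 my_db companies.csv -u root --fields-enclosed-by='"' --lines-terminated-by='\r\n' --fields-terminated-by=','

Inside single quotes, bash passes the double quote through to mysqlimport unchanged, which is exactly what the echo demo above shows.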
Demo environment:
# uname -o -r -s
Linux 3.13.0-83-generic GNU/Linux
# echo $SHELL
/bin/bash
Source: https://stackoverflow.com/questions/37930310/i-am-getting-all-data-imported-in-mysql-from-csv-file-in-the-first-column-all-o