mysqlimport

mysqlimport issues "SET @@character_set_database=binary", which prevents loading JSON values

Submitted by 落爺英雄遲暮 on 2019-12-23 09:54:45
Question: I have been using mysqlimport without problems for a long time; now that MySQL 5.7 has added JSON data type support, I'm trying to use mysqlimport with rows containing JSON data. Here is an example of a row in the CSV file that will be imported using mysqlimport:

    column_A_value,column_B_value,[{"x":20,"y":"some name"}]

Notice that the last column's type is JSON. Now when using mysqlimport as follows:

    mysqlimport -u user -ppass -h localhost --columns='col_A,col_B,col_C' --local --fields-terminated
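The title already names the culprit: mysqlimport issues SET @@character_set_database=binary before its LOAD DATA statement, and MySQL refuses to build a JSON value from a string in the binary character set. One workaround is to skip mysqlimport and run LOAD DATA LOCAL INFILE yourself with an explicit character set. A minimal sketch, assuming the database is mydb, the table is mytable, and local_infile is enabled on the server:

    mysql -u user -ppass -h localhost --local-infile=1 mydb -e "
      LOAD DATA LOCAL INFILE 'mytable.csv'
      INTO TABLE mytable
      CHARACTER SET utf8mb4
      FIELDS TERMINATED BY ','
      (col_A, col_B, col_C);"

Because the statement is issued directly, no SET @@character_set_database=binary precedes it, so the JSON column (col_C here) loads normally; note that a comma-terminated format like the sample row only works if the JSON text itself contains no unquoted commas.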

All data from the CSV file is imported into the first MySQL column; all other columns get NULL values

Submitted by 时光毁灭记忆、已成空白 on 2019-12-22 13:58:33
Question: I am using MySQL locally on my Mac. I have a CSV file with all fields enclosed in double quotes. Here is an extract of the first rows of my CSV file:

    $ head -5 companies.csv
    "COMPANY_ADDRESS1","COMPANY_ADDRESS2","COMPANY_CITY","COMPANY_COUNTRY","COMPANY_FAX","COMPANY_GERMANY","COMPANY_ID","COMPANY_INDIANAPOLIS","COMPANY_NAME","COMPANY_STATE","COMPANY_TELEPHONE","COMPANY_VAT","COMPANY_ZIP"
    "980 Madison Avenue","6th Floor","New York","USA","","","1393","","Lucky Fives LLC","NY","212-74-2313","",
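The question is truncated before the mysqlimport command, but the classic cause of everything landing in the first column with NULLs elsewhere is that the field delimiter and the enclosing quotes were never declared, so each whole line is treated as a single field. A minimal sketch, assuming the target table is named companies (mysqlimport derives the table name from the file name) and the database is mydb:

    mysqlimport --local --ignore-lines=1 \
      --fields-terminated-by=',' \
      --fields-enclosed-by='"' \
      --lines-terminated-by='\n' \
      -u root -p mydb /path/to/companies.csv

--ignore-lines=1 skips the header row. Without --fields-terminated-by and --fields-enclosed-by, mysqlimport falls back to its defaults (tab-separated, unquoted), which is exactly how a comma-separated, quoted file ends up entirely in column one.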

dumping a mysql table to CSV (stdout) and then tunneling the output to another server

Submitted by 二次信任 on 2019-12-22 10:29:24
Question: I'm trying to move a database table to another server; the complication is that the machine currently hosting the table has little to no disk space left, so I'm looking for a solution that can work over the network. I have tried mysqldumping the database from the source machine and piping it into mysql at the destination, but my database has 48M rows, and even with autocommit turned off and innodb_flush_log_at_trx_commit set to 2, I am getting some dog-slow times.

    mysqldump -uuser -ppass --opt dbname dbtable | mysql -h remove
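Since the point is to avoid local disk entirely, one option is to keep the whole transfer in a pipeline: compress the dump on the fly and feed it straight into mysql on the destination over ssh, so nothing is written on the source. A minimal sketch; host names, credentials, and database names are placeholders:

    mysqldump -uuser -ppass --single-transaction --quick dbname dbtable \
      | gzip -c \
      | ssh user@dest-server 'gunzip -c | mysql -uuser -ppass dbname'

--quick streams rows instead of buffering the whole table in memory, and compressing before the ssh hop cuts the transfer time when the network, rather than the destination's InnoDB flushing, is the bottleneck.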

How to import a mysql dump while renaming some tables/columns and not importing others at all?

Submitted by 给你一囗甜甜゛ on 2019-12-19 04:05:52
Question: I'm importing a legacy db into a new version of our program, and I'm wondering if there's a way to not import some columns/tables from the dump, and to rename other tables/columns as I import them. I'm aware I could edit the dump file in theory, but that seems like a hack, and so far none of my editors can handle opening the 1.3 GB file (yes, I've read the question about that on here; no, none of the answers worked for me so far). Suggestions?

Answer 1: It's possible to not import some tables by
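The answer is cut off, but the two standard tricks it is pointing toward are: exclude tables when the dump is produced, and rewrite names in a stream so the 1.3 GB file never has to be opened in an editor. A minimal sketch; the table and database names are only examples:

    # leave a table out of the dump entirely
    mysqldump -uuser -p legacydb --ignore-table=legacydb.audit_log > dump.sql

    # rename a table on the fly while importing, without editing the file
    sed 's/`old_table_name`/`new_table_name`/g' dump.sql | mysql -uuser -p newdb

Renaming or dropping individual columns is harder to do safely with stream edits; importing into a staging database and then moving data into the final schema with ALTER TABLE / INSERT ... SELECT is the more reliable route.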

How to import LARGE SQL files into a MySQL table

Submitted by 别来无恙 on 2019-12-18 05:03:25
Question: I have a PHP script that parses XML files and creates a large SQL file that looks something like this:

    INSERT IGNORE INTO table(field1,field2,field3...) VALUES ("value1","value2",int1...), ("value1","value2",int1)...etc

This file adds up to over 20 GB (I've tested a 2.5 GB file, but it fails too). I've tried commands like:

    mysql -u root -p table_name < /var/www/bigfile.sql

This works on smaller files, say around 50 MB, but it doesn't work with a larger file. I tried:

    mysql> source /var/www
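The question is cut off at the source command, but with multi-gigabyte extended INSERT statements the first things to check are the statement size limit and whether the import is quietly being interrupted. A minimal sketch, assuming the positional argument is the database to import into and that the server's max_allowed_packet has also been raised in my.cnf:

    # raise the client-side packet limit and stream the file from the shell
    mysql -u root -p --max_allowed_packet=1G db_name < /var/www/bigfile.sql

    # optional: if pv is installed, it shows progress on a multi-hour import
    pv /var/www/bigfile.sql | mysql -u root -p db_name

If a single INSERT ... VALUES list is itself larger than max_allowed_packet, the generating PHP script needs to split it into smaller batches; no client-side flag can get around a statement that exceeds the server's limit.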

mysqlimport from a pipe

Submitted by ⅰ亾dé卋堺 on 2019-12-12 13:27:44
Question: I'm trying to figure out how to pipe output into mysqlimport, without any luck. I have a huge file (~250 GB) that I want to pipe to mysqlimport after processing it; I don't want to create an intermediate file/table. I'm imagining something like this:

    cat genome.mpileup | nawk 'sub("^...","")' | mysqlimport -uuser -ppassword Database

But obviously this isn't working. Any suggestions on how to accomplish this?

Answer 1: It doesn't look like mysqlimport can read from STDIN, but you can perhaps
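The answer is truncated, but a workaround often suggested for exactly this situation is a named pipe (FIFO): mysqlimport still thinks it is reading a file, and the file name still determines the target table, yet the data is produced on the fly and never lands on disk. A sketch under those assumptions; whether a FIFO is accepted can depend on the client version, so treat it as something to test:

    mkfifo /tmp/genome                  # "genome" becomes the target table name
    nawk 'sub("^...","")' genome.mpileup > /tmp/genome &
    mysqlimport -uuser -ppassword --local Database /tmp/genome
    rm /tmp/genome

The same FIFO trick should also work with a hand-written LOAD DATA LOCAL INFILE statement if finer control over the columns is needed.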

Importing text to MySQL: strange format

Submitted by 谁说胖子不能爱 on 2019-12-12 03:42:42
Question: I'm importing some data from a .txt file into a MySQL database table using mysqlimport. It seems to import OK (no error messages), but the data looks very odd when displayed and can't be searched as expected. Here are the details. The original text file is saved in UTF-8, with records that look (in a text editor) like this; the second field includes line breaks:

    WAR-16,52 ~~~~~ Lorem ipsum dolor sit. Lorem ipsum dolor sit. ~~~~~ ENDOFRECORD
    WAR-16,53~~~~~Lorem ipsum dolor sit. Lorem ipsum dolor sit.
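The question is cut off before the mysqlimport command and the garbled output, but with a UTF-8 source file and custom record markers the usual suspects are the field/record terminators and the connection character set. A minimal sketch, assuming fields are separated by ~~~~~ and each record ends with ENDOFRECORD followed by a newline (the exact terminator strings, including any surrounding spaces, would need to match the file); using the record marker rather than the newline as the line terminator is also what lets the embedded line breaks survive inside the second field:

    mysqlimport --local -u root -p \
      --default-character-set=utf8mb4 \
      --fields-terminated-by='~~~~~' \
      --lines-terminated-by='ENDOFRECORD\n' \
      mydb /path/to/records.txt

Declaring --default-character-set=utf8mb4 (and making sure the table and columns are utf8mb4 as well) is what keeps multi-byte text from displaying oddly and behaving unexpectedly in searches after the load.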

mysqlimport NULL values when importing from delimited file

Submitted by 懵懂的女人 on 2019-12-11 10:44:29
Question: I am running mysqlimport to load data and am having an issue with empty values not being loaded as NULL. I am using MySQL 5.7.23. The issue can be recreated by creating a file named mytable.psv containing:

    code|date_a|date_b
    1|2018-11-27|2018-11-27
    2|2018-11-27|

Then run the commands:

    mysql -u root -e "CREATE DATABASE mydb"
    mysql -u root -e "USE mydb; CREATE TABLE mytable (code varchar(15) NOT NULL, date_a date NOT NULL, date_b date);"
    mysqlimport --ignore-lines=1 --fields-terminated-by='|' -
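This is documented LOAD DATA behaviour rather than a bug: an empty field is read as an empty string and converted to the column type's implicit default (the "zero" date for a DATE column), while only the sequence \N is treated as NULL. Two ways around it: write \N instead of nothing in the file, or switch from mysqlimport to LOAD DATA so the empty string can be mapped to NULL explicitly. A minimal sketch of the second option, reusing the question's own database, table, and file names:

    mysql -u root mydb --local-infile=1 -e "
      LOAD DATA LOCAL INFILE 'mytable.psv'
      INTO TABLE mytable
      FIELDS TERMINATED BY '|'
      IGNORE 1 LINES
      (code, date_a, @date_b)
      SET date_b = NULLIF(@date_b, '');"

Reading the last column into a user variable and applying NULLIF turns the empty trailing field of row 2 into a real NULL; mysqlimport itself has no equivalent of the SET clause.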

mysqlimport without a MySQL installation?

Submitted by 妖精的绣舞 on 2019-12-11 06:46:56
Question: I'm trying to use mysqlimport to import text files into a MySQL database. The problem is that the Linux box I am importing the text files from will not have an installation of MySQL, and I am importing these files into a database on a different server. Does mysqlimport need a full installation of MySQL to work? Can I just bring over the mysqlimport executable and some libraries? Cheers, Kaiser

Answer 1: Yes, you can use it standalone; among the options described here you can specify the host to connect to
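mysqlimport is a client program, so it only needs to reach the server over the network; no local mysqld is required (on most distributions it ships in the client package, e.g. mysql-client, together with the shared libraries it links against). A minimal sketch of pointing it at a remote server; the host, database, and file path are placeholders:

    mysqlimport --host=db.example.com --port=3306 --local \
      -u user -p mydb /path/to/mytable.txt

--local makes the client read the file from the machine where mysqlimport runs and ship the rows to the remote server, which is exactly the situation described in the question.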

How to insert multiple selected checkbox values into different rows of a MySQL database in Android and PHP

Submitted by 孤人 on 2019-12-11 05:49:56
Question: Here I am accessing contacts from the phone and displaying them in a custom ListView. Now I have to insert the selected checkbox values into different rows of a MySQL database. At the moment I am able to insert a single checkbox value into the database; if I select more than one value, the data is stored in a single row. What modifications do I have to make to insert each value into its own row? I am using a PHP file to store the data in the database.

    DisplayContact.java
    public class DisplayContact extends Activity
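The Java and PHP code are cut off, but the symptom (several selections ending up concatenated in one row) usually means the values are joined into a single field before the INSERT. On the SQL side, one row per selected contact means either one INSERT per value or a single multi-row INSERT; a hedged sketch with hypothetical table and column names:

    INSERT INTO selected_contacts (user_id, contact_name, contact_phone) VALUES
      (7, 'Alice', '555-0101'),
      (7, 'Bob',   '555-0102'),
      (7, 'Carol', '555-0103');

The PHP script would loop over the array of selected contacts it receives from the app and build one VALUES tuple (or one prepared-statement execution) per contact, instead of imploding them into a single string.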