MYSQL: Display Skipped records after LOAD DATA INFILE?

Asked 2021-02-19 15:34

In MySQL I've used LOAD DATA LOCAL INFILE, which works fine. At the end I get a message like:

Records: 460377  Deleted: 0  Skipped: 145280  Warnings

Is there a way to display which records were skipped?
5 Answers
  • 2021-02-19 16:09

    If there were no warnings but some rows were skipped, it may mean that the primary key was duplicated for the skipped rows.

    The easiest way to find duplicates is to open the local file in Excel and run a duplicate removal on the primary key column to see if there are any.
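
    To double-check that point, the per-row diagnostics can be listed in the same session immediately after the load; a minimal sketch (it only reports what the server actually recorded):

    -- run right after LOAD DATA, in the same connection
    SHOW WARNINGS;            -- one message per recorded problem row
    SHOW COUNT(*) WARNINGS;   -- just the total count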

  • 2021-02-19 16:17

    You could create a temp table with the primary key removed so that it allows duplicates, and then load the data into it (a sketch of this approach follows the query below).

    Construct a SQL statement like

    SELECT COUNT(column_with_duplicates) AS num_duplicates, column_with_duplicates
    FROM temp_table
    GROUP BY column_with_duplicates
    HAVING num_duplicates > 1;

    This will show you the rows with redundancies. Another way is to just dump out the rows that were actually inserted into the table, and run a file difference command against the original to see which ones weren't included.
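
    A minimal sketch of the temp-table idea mentioned at the start of this answer; the table, column, and file names are placeholders, and it assumes the primary key is the only uniqueness constraint and that the key column is not AUTO_INCREMENT (so the key can be dropped):

    -- stage the raw file with no primary key, so duplicate keys are kept
    CREATE TEMPORARY TABLE staging LIKE my_table;
    ALTER TABLE staging DROP PRIMARY KEY;
    LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE staging;

    -- any key value appearing more than once here explains a skipped row
    SELECT id, COUNT(*) AS num_duplicates
    FROM staging
    GROUP BY id
    HAVING num_duplicates > 1;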

  • 2021-02-19 16:30

    Records are skipped when any database constraint is not met. Check for common causes like the following (a quick way to review them is shown after the list):

    • Primary key duplication
    • Unique key violation
    • Partition constraint violation
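
    A quick way to review which of these apply to the target table (my_table is a placeholder):

    SHOW CREATE TABLE my_table;   -- shows the primary key, unique keys, and any PARTITION BY clause
    SHOW INDEX FROM my_table;     -- lists the indexes on their own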
  • 2021-02-19 16:32

    For anyone stumbling onto this:

    Another option is to do a SELECT ... INTO OUTFILE and diff the two files. For example:

    LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE my_table FIELDS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r' IGNORE 1 LINES (title, `desc`, is_viewable);

    SELECT title, `desc`, is_viewable INTO OUTFILE 'data_rows.txt' FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\r' FROM my_table;

    Then run FileMerge (on Mac OS X) on data.txt and data_rows.txt to see the differences. If you get an access-denied error when doing the SELECT ... INTO OUTFILE, make sure you run:

    GRANT FILE ON *.* TO 'mysql_user'@'localhost';
    FLUSH PRIVILEGES;

    Run these as the root user in the mysql client.
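
    To confirm the grant took effect (same example user as above):

    SHOW GRANTS FOR 'mysql_user'@'localhost';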

  • 2021-02-19 16:32

    I use the bash command line to find duplicate rows in the CSV file:

    awk -F, '{print $1 "," $2}' /my/source/file.csv | sort | uniq -c | grep -v '^ *1 '

    where the first two columns form the primary key.
