To ignore duplicate keys during 'copy from' in postgresql

天命终不由人 2020-12-07 14:11

I have to dump a large amount of data from a file into a PostgreSQL table. I know it does not support 'ignore', 'replace', etc. as is done in MySQL. Almost all posts regarding this

4 answers
  • 2020-12-07 14:16

    PostgreSQL 9.5 now has upsert functionality. You can follow Igor's instructions, except that the final INSERT includes the clause ON CONFLICT DO NOTHING.

    INSERT INTO main_table
    SELECT *
    FROM tmp_table
    ON CONFLICT DO NOTHING
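
    For completeness, the conflict target can also be named explicitly, so that only rows violating that particular constraint are skipped. This is a sketch only: it assumes a unique or primary-key constraint on a column called id, which the question does not specify.

    -- Skip only rows that collide on main_table's key;
    -- 'id' is a placeholder for the actual constrained column.
    INSERT INTO main_table
    SELECT *
    FROM tmp_table
    ON CONFLICT (id) DO NOTHING;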
    
  • 2020-12-07 14:28

    Igor's answer helped me a lot, but I also ran into the problem Nate mentioned in his comment. On top of that, perhaps going beyond the question here, my new data did not only contain duplicates internally but also duplicated rows already in the existing table. What worked for me was the following.

    CREATE TEMP TABLE tmp_table AS SELECT * FROM newsletter_subscribers;  -- start with the existing rows
    COPY tmp_table (name, email) FROM STDIN DELIMITER ' ' CSV;  -- append the new data
    SELECT count(*) FROM tmp_table;  -- Just to be sure
    TRUNCATE newsletter_subscribers;
    INSERT INTO newsletter_subscribers
        SELECT DISTINCT ON (email) * FROM tmp_table
        ORDER BY email, subscription_status;
    SELECT count(*) FROM newsletter_subscribers;  -- Paranoid again
    

    Both the internal duplicates and the duplicates of existing rows end up side by side in tmp_table, and the DISTINCT ON (email) part then removes them. The ORDER BY makes sure that the desired row comes first within each group of duplicates; DISTINCT ON keeps that first row and discards all further rows.
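
    A tiny example of how the ORDER BY steers which duplicate survives (the values are invented for illustration; column names are the ones used above):

    -- Suppose tmp_table contains two rows for the same email:
    --   ('a@x.com', 'active') and ('a@x.com', 'inactive')
    -- ORDER BY email, subscription_status sorts 'active' first,
    -- so DISTINCT ON (email) keeps the 'active' row and drops the other.
    SELECT DISTINCT ON (email) *
    FROM tmp_table
    ORDER BY email, subscription_status;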

  • 2020-12-07 14:29

    Use the same approach as you described, but delete (or group, or otherwise modify) the rows with duplicate PKs in the temp table before loading into the main table.

    Something like:

    CREATE TEMP TABLE tmp_table 
    ON COMMIT DROP
    AS
    SELECT * 
    FROM main_table
    WITH NO DATA;
    
    COPY tmp_table FROM 'full/file/name/here';
    
    INSERT INTO main_table
    SELECT DISTINCT ON (PK_field) *
    FROM tmp_table
    ORDER BY (some_fields);
    

    Details: CREATE TABLE AS, COPY, DISTINCT ON

  • 2020-12-07 14:34

    Insert into a temp table grouped by the key, so you get rid of the duplicates, and then insert if not exists.
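
    On PostgreSQL versions before 9.5, where ON CONFLICT is unavailable, "insert if not exists" can be sketched as an anti-join. The names main_table, tmp_table, and id are placeholders, and note that without a unique constraint this pattern is not safe under concurrent writers:

    -- Insert only rows whose key is not already in the target table.
    INSERT INTO main_table
    SELECT t.*
    FROM tmp_table t
    WHERE NOT EXISTS (
        SELECT 1 FROM main_table m WHERE m.id = t.id
    );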
