Save PL/pgSQL output from PostgreSQL to a CSV file


What is the easiest way to save PL/pgSQL output from a PostgreSQL database to a CSV file?

I'm using PostgreSQL 8.4 with pgAdmin III and the PSQL plugin where I run queries.

18 Answers
  • 2020-11-22 12:03

    The new version, psql 12, will support --csv.

    From the psql (devel) documentation:

    --csv

    Switches to CSV (Comma-Separated Values) output mode. This is equivalent to \pset format csv.


    csv_fieldsep

    Specifies the field separator to be used in CSV output format. If the separator character appears in a field's value, that field is output within double quotes, following standard CSV rules. The default is a comma.

    Usage:

    psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres
    
    psql -c "SELECT * FROM pg_catalog.pg_tables" --csv -P csv_fieldsep='^'  postgres
    
    psql -c "SELECT * FROM pg_catalog.pg_tables" --csv  postgres > output.csv
    
  • 2020-11-22 12:04

    There are several solutions:

    1 psql command

    psql -d dbname -t -A -F"," -c "select * from users" > output.csv

    This has the big advantage that you can use it via SSH, as in ssh postgres@host command, enabling you to get the CSV onto your local machine (see the sketch at the end of this answer).

    2 postgres copy command

    COPY (SELECT * FROM users) TO '/tmp/output.csv' WITH CSV;

    3 psql interactive (or not)

    >psql dbname
    psql>\f ','
    psql>\a
    psql>\o '/tmp/output.csv'
    psql>SELECT * from users;
    psql>\q
    

    All of them can be used in scripts, but I prefer #1.

    4 pgAdmin, but that's not scriptable.
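    As a rough sketch of the SSH variant mentioned in #1 (the host name, user, and database are placeholders):

    ssh postgres@host "psql -d dbname -t -A -F',' -c 'select * from users'" > output.csv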

  • 2020-11-22 12:04

    I'm working on AWS Redshift, which does not support the COPY TO feature.

    My BI tool supports tab-delimited CSVs though, so I used the following:

     psql -h dblocation -p port -U user -d dbname -F $'\t' --no-align -c "SELECT * FROM TABLE" > outfile.csv
    
  • 2020-11-22 12:06

    In pgAdmin III there is an option to export to file from the query window. In the main menu it's Query -> Execute to file, or there's a button that does the same thing (it's a green triangle with a blue floppy disk, as opposed to the plain green triangle which just runs the query). If you're not running the query from the query window, I'd do what IMSoP suggested and use the copy command, as shown below.
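    For reference, a minimal sketch of that approach from psql, using the client-side \copy variant so the file is written on your machine rather than on the server (the table and path are placeholders):

    \copy (SELECT * FROM users) TO '/tmp/output.csv' WITH CSV HEADER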

  • 2020-11-22 12:06
    # Note: despite the question's title, this snippet writes JSON, not CSV.
    import json

    import psycopg2  # assumed driver; the connection parameters below are placeholders

    conn = psycopg2.connect(dbname="your_db", user="your_user")
    cursor = conn.cursor()
    qry = """ SELECT details FROM test_csvfile """
    cursor.execute(qry)
    rows = cursor.fetchall()

    # Serialize the result rows as JSON and write them to disk.
    value = json.dumps(rows)

    with open("/home/asha/Desktop/Income_output.json", "w+") as f:
        f.write(value)
    print('Saved to file successfully')
    
  • 2020-11-22 12:11

    In the terminal (while connected to the db), set the output to the CSV file. A combined version of these steps is shown at the end of this answer.

    1) Set the field separator to ',':

    \f ','
    

    2) Set output format unaligned:

    \a
    

    3) Show only tuples:

    \t
    

    4) Redirect the output to a file:

    \o '/tmp/yourOutputFile.csv'
    

    5) Execute your query:

    select * from YOUR_TABLE;
    

    6) Close the output file (output returns to the terminal):

    \o
    

    You will then be able to find your csv file in this location:

    cd /tmp
    

    Copy it using the scp command or edit using nano:

    nano /tmp/yourOutputFile.csv
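    Put together, a complete session might look like this (the table name and output path are placeholders):

    \f ','
    \a
    \t
    \o '/tmp/yourOutputFile.csv'
    select * from YOUR_TABLE;
    \o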
    