Question
I have a dataframe in Databricks called customerDetails.
+--------------------+-----------+
| customerName| customerId|
+--------------------+-----------+
|John Smith | 0001|
|Jane Burns | 0002|
|Frank Jones | 0003|
+--------------------+-----------+
I would like to be able to copy this from Databricks to a table within Postgres.
I found a post which used psycopg2 to copy individual rows to Postgres, and I am trying to adapt it to copy each row of the dataframe into the Postgres table:
import psycopg2

v1 = 'testing_name'
v2 = 'testing_id'

conn = psycopg2.connect(host="HOST_NAME",
                        port="PORT",
                        user="USER_NAME",
                        password="PASSWORD",
                        database="DATABASE_NAME")
cursor = conn.cursor()
cursor.execute("INSERT INTO customerTable (customerName, customerId) VALUES (%s, %s)", (v1, v2))
conn.commit()
cursor.close()
conn.close()
Answer 1:
You can insert the data into your table row by row.
Also see the documentation for cursor.executemany: you can reshape your data into a list of tuples and pass that list as the second argument, so all rows are inserted in a single call.
The code will be almost identical to the example you gave:
cursor = conn.cursor()

def append_to_table(row):
    cursor.execute(
        "INSERT INTO customerTable (customerName, customerId) VALUES (%s, %s)",
        (row.customerName, row.customerId),
    )

# collect() brings the rows back to the driver so the psycopg2 cursor can
# be used locally. (df.rdd.map(append_to_table) would not work: map is a
# lazy transformation that never runs without an action, and the cursor
# cannot be serialized to the executors anyway.)
for row in df.collect():
    append_to_table(row)

conn.commit()
cursor.close()
conn.close()
Source: https://stackoverflow.com/questions/50005707/write-to-postgres-from-dataricks-using-python