During an ETL process I needed to extract and load a JSON column from one Postgres database to another. We use Pandas for this since it has so many ways to read and write data from different sources and formats.
I am unable to comment on peralmq's answer, but in the case of PostgreSQL JSONB you can use:
from sqlalchemy.dialects import postgresql
dataframe.to_sql(..., dtype={"json_column": postgresql.JSONB})
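To give that some context, a fuller call might look like the sketch below; the engine URL, table name, and sample data are assumptions of mine, not the original setup:

import pandas as pd
import sqlalchemy
from sqlalchemy.dialects import postgresql

# hypothetical connection and data; adjust to your own setup
engine = sqlalchemy.create_engine("postgresql://user:pass@localhost:5432/analytics")
df = pd.DataFrame({"id": [1, 2],
                   "json_column": [{"a": 1}, {"b": 2, "c": "text"}]})

# declaring JSONB makes SQLAlchemy create the column as jsonb and
# serialize the Python dicts on insert
df.to_sql("my_table", engine, if_exists="replace", index=False,
          dtype={"json_column": postgresql.JSONB})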
I've been searching the web for a solution but couldn't find one, so here is what we came up with (there might be better ways, but at least this is a start if someone else runs into this).
Specify the dtype parameter in to_sql.
We went from:

df.to_sql(table_name, analytics_db)

to:

df.to_sql(table_name, analytics_db, dtype={'name_of_json_column_in_source_table': sqlalchemy.types.JSON})

and it just works.
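For context, a minimal end-to-end sketch of this approach might look like the following; the connection URLs, table name, and query are placeholders, not the actual setup:

import pandas as pd
import sqlalchemy

# hypothetical source and destination databases
source_db = sqlalchemy.create_engine("postgresql://user:pass@source-host/source")
analytics_db = sqlalchemy.create_engine("postgresql://user:pass@analytics-host/analytics")

# psycopg2 deserializes the json column into Python dicts on the way in
df = pd.read_sql("SELECT * FROM source_table", source_db)

# declaring the column as JSON makes SQLAlchemy serialize those dicts
# again on the way out
df.to_sql("source_table", analytics_db, if_exists="replace", index=False,
          dtype={'name_of_json_column_in_source_table': sqlalchemy.types.JSON})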
If you (re-)create the JSON column using json.dumps(), you're all set. This way the data can be written not only with pandas' .to_sql() method, but also with the much faster COPY method of PostgreSQL (via copy_expert() of psycopg2 or sqlalchemy's raw_connection()).
For the sake of simplicity, let's assume that we have a column of dictionaries that should be written into a JSON(B) column:
import json
import pandas as pd

df = pd.DataFrame([['row1', {'a': 1, 'b': 2}],
                   ['row2', {'a': 3, 'b': 4, 'c': 'some text'}]],
                  columns=['r', 'kv'])

# conversion function:
def dict2json(dictionary):
    return json.dumps(dictionary, ensure_ascii=False)

# overwrite the dict column with json-strings
df['kv'] = df.kv.map(dict2json)
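From here, the COPY path mentioned above could look roughly like this sketch; the connection URL and the json_demo table (with columns r text and kv jsonb) are assumptions of mine, and COPY requires the target table to exist beforehand:

import io
import sqlalchemy

# hypothetical target database; adjust the URL to your own setup
engine = sqlalchemy.create_engine("postgresql://user:pass@localhost:5432/analytics")

# create the target table with a jsonb column (name and schema are assumed)
with engine.begin() as conn:
    conn.execute(sqlalchemy.text("CREATE TABLE IF NOT EXISTS json_demo (r text, kv jsonb)"))

# dump the DataFrame to an in-memory CSV buffer; the json-strings survive
# CSV quoting and PostgreSQL parses them into jsonb on the way in
buf = io.StringIO()
df.to_csv(buf, index=False, header=False)
buf.seek(0)

# stream the buffer into the table with COPY via psycopg2's copy_expert()
conn = engine.raw_connection()
try:
    with conn.cursor() as cur:
        cur.copy_expert("COPY json_demo (r, kv) FROM STDIN WITH CSV", buf)
    conn.commit()
finally:
    conn.close()

Writing the same frame with df.to_sql('json_demo', engine, if_exists='append', index=False) into that pre-created table should also work, since the json-strings are cast to jsonb by PostgreSQL on insert.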