How to write into PostgreSQL hstore using Spark Dataset

予麋鹿 2021-01-18 11:23

I'm trying to write a Spark Dataset into an existing PostgreSQL table (I can't change the table metadata, such as column types). One of the columns of this table is of type HStore.

2 Answers
  • 2021-01-18 12:09

    This is PySpark code for writing a DataFrame to a Postgres table that has HSTORE, JSON, and JSONB columns. In general, for any complicated Postgres column type that cannot be represented in a Spark DataFrame, you need to specify stringtype="unspecified" in the options (or properties) you pass to the DataFrame-to-SQL write call.

    Below is an example of writing a Spark Dataframe to PostgreSQL table using write() function:

    dataframe.write \
        .format("jdbc") \
        .options(
            driver=driver,
            user=username,
            password=password,
            url=target_database_url,
            dbtable=table,
            stringtype="unspecified"
        ) \
        .mode("append") \
        .save()
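    One detail the answer leaves implicit: Spark has no hstore type, so the DataFrame column has to hold the value as a plain string already written in PostgreSQL's hstore literal syntax ("key"=>"value",...), which Postgres can then cast thanks to stringtype="unspecified". A minimal sketch of a helper that renders a Python dict that way (the helper and its name are illustrative, not part of the answer):

    ```python
    def to_hstore_literal(pairs):
        # Render a dict as a PostgreSQL hstore literal,
        # e.g. {"a": "1"} -> '"a"=>"1"'.
        # Escaping of embedded quotes/backslashes is omitted for brevity.
        return ",".join('"{}"=>"{}"'.format(k, v) for k, v in pairs.items())

    print(to_hstore_literal({"a": "1", "b": "2"}))  # '"a"=>"1","b"=>"2"'
    ```

    Such a helper could be wrapped in a UDF to convert a MapType column to a string column before the JDBC write.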
    
  • 2021-01-18 12:19

    It turns out I just have to let Postgres guess the appropriate type of my column, by setting stringtype to unspecified in the connection properties, as described in the official documentation.

    props.put("stringtype", "unspecified")
    

    Now it works perfectly!
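    For reference, stringtype is a regular pgJDBC connection parameter, so the same setting can also be carried in the JDBC URL itself instead of a Properties object. A small sketch (host, port, and database name are placeholders):

    ```python
    # Placeholder connection details; only the stringtype parameter matters here.
    base_url = "jdbc:postgresql://localhost:5432/mydb"

    # Same effect as props.put("stringtype", "unspecified"), but embedded in
    # the URL, so it works anywhere only a JDBC URL can be passed.
    url = base_url + "?stringtype=unspecified"

    print(url)  # jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified
    ```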
