Saving / exporting transformed DataFrame back to JDBC / MySQL

时光取名叫无心 · 2021-02-09 06:05

I'm trying to figure out how to use the new DataFrameWriter to write data back to a JDBC database. I can't seem to find any documentation for this, although looking…

1 Answer
不知归路 · 2021-02-09 06:21

    Update

    Current Spark versions (2.0 or later) support table creation on write.
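
    A minimal sketch of create-on-write with Spark 2.0+, where the table is created if it does not exist and `SaveMode` controls what happens when it does. The URL and table name are taken from the answer below; the credentials and driver class are assumptions for illustration:

    ```scala
    import java.util.Properties

    // Hypothetical connection details -- adjust for your environment.
    val url = "jdbc:mysql://localhost/foo"
    val props = new Properties()
    props.setProperty("user", "foo_user")     // assumed user
    props.setProperty("password", "foo_pass") // assumed password
    props.setProperty("driver", "com.mysql.cj.jdbc.Driver") // Connector/J 8 class name

    // With Spark 2.0+ the table can be created on write; SaveMode decides
    // the behavior when foo.bar2 already exists:
    //   df.write.mode("errorifexists").jdbc(url, "foo.bar2", props) // default: fail
    //   df.write.mode("append").jdbc(url, "foo.bar2", props)        // insert rows
    //   df.write.mode("overwrite").jdbc(url, "foo.bar2", props)     // drop and recreate
    ```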

    The original answer

    It is possible to write to an existing table, but it looks like at this moment (Spark 1.5.0) creating a table using the JDBC data source is not supported yet*. You can check SPARK-7646 for reference.

    If the table already exists, you can simply use the DataFrameWriter.jdbc method:

    val prop: java.util.Properties = ???
    df.write.jdbc("jdbc:mysql://localhost/foo", "foo.bar2", prop)
    
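    The `???` placeholder just stands for a populated `java.util.Properties` holding the JDBC credentials. A sketch, with the user and password being assumed values:

    ```scala
    import java.util.Properties

    val prop = new Properties()
    prop.setProperty("user", "bar_user")     // hypothetical user
    prop.setProperty("password", "bar_pass") // hypothetical password

    // Then, against an existing table (Spark 1.5.0):
    // df.write.jdbc("jdbc:mysql://localhost/foo", "foo.bar2", prop)
    ```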

    * Interestingly, PySpark seems to support table creation using the jdbc method.
