Write DataFrame to mysql table using pySpark

暗喜 2020-12-16 00:30

I am attempting to insert records into a MySQL table. The table contains id and name as columns.

I am doing it like the below in a PySpark shell.

1 Answer
  • 2020-12-16 01:16

    Use a Spark DataFrame instead of a pandas DataFrame, since .write is available only on Spark DataFrames.

    So the final code could be:

    # each record is a tuple so it maps onto the (id, name) columns
    data = [('103', 'tester_1')]
    
    df = sc.parallelize(data).toDF(['id', 'name'])
    
    # write the DataFrame to the MySQL table over JDBC
    df.write.format('jdbc').options(
          url='jdbc:mysql://localhost/database_name',
          driver='com.mysql.jdbc.Driver',
          dbtable='DestinationTableName',
          user='your_user_name',
          password='your_password').mode('append').save()
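
    If the records are already in a pandas DataFrame (as the answer's reference to pandas suggests), a minimal sketch of the conversion, assuming an active SparkSession named spark (created automatically in the PySpark shell), a hypothetical pandas_df variable, and the same placeholder connection details as above, could be:

    import pandas as pd
    
    pandas_df = pd.DataFrame({'id': ['103'], 'name': ['tester_1']})
    
    # convert to a Spark DataFrame so that .write becomes available
    spark_df = spark.createDataFrame(pandas_df)
    
    spark_df.write.format('jdbc').options(
          url='jdbc:mysql://localhost/database_name',
          driver='com.mysql.jdbc.Driver',
          dbtable='DestinationTableName',
          user='your_user_name',
          password='your_password').mode('append').save()

    In either case the MySQL JDBC driver JAR has to be on Spark's classpath, for example by starting the shell with --jars /path/to/the/connector JAR or via the spark.jars.packages configuration.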
    