PySpark No suitable driver found for jdbc:mysql://dbhost

时光取名叫无心 2021-01-05 16:21

I am trying to write my dataframe to a MySQL table. I am getting "No suitable driver found for jdbc:mysql://dbhost" when I try to write.

As part of the prep
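
For context, a minimal sketch of the kind of write that raises this error is shown below; the host, database, table, and credentials (dbhost, mydb, mytable, myuser) are placeholders rather than values from the question.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("mysql-write").getOrCreate()

    # Toy DataFrame standing in for the real data.
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

    # Hypothetical connection details; adjust host, database, table, and credentials.
    url = "jdbc:mysql://dbhost:3306/mydb"
    properties = {
        "user": "myuser",
        "password": "secret",
        # Naming the driver class explicitly is often worth trying, but on the
        # affected Spark versions the write path can still fail with
        # "No suitable driver found" (see the answers below).
        "driver": "com.mysql.jdbc.Driver",
    }

    # Append the DataFrame to the MySQL table over JDBC.
    df.write.jdbc(url=url, table="mytable", mode="append", properties=properties)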

2 Answers
  • 2021-01-05 16:25

    This is a bug related to the classloader. The ticket for it is https://issues.apache.org/jira/browse/SPARK-8463 and the pull request is https://github.com/apache/spark/pull/6900.

    A workaround is to copy mysql-connector-java-5.1.35-bin.jar to every machine at the same location as it is on the driver.
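
    Once the jar is present at the same path on every node, one way to point both the driver and the executors at it is through the extraClassPath settings. A rough sketch follows; the jar path is an assumption, and in practice these settings usually go into spark-defaults.conf or onto the spark-submit command line, since setting them programmatically only works if the driver JVM has not started yet.

        from pyspark import SparkConf
        from pyspark.sql import SparkSession

        # Hypothetical location; the jar must exist at this exact path on every node.
        jar_path = "/opt/jars/mysql-connector-java-5.1.35-bin.jar"

        # Put the connector on both the driver and executor classpaths.
        conf = (SparkConf()
                .set("spark.driver.extraClassPath", jar_path)
                .set("spark.executor.extraClassPath", jar_path))

        spark = SparkSession.builder.config(conf=conf).getOrCreate()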

  • 2021-01-05 16:49

    It seems that you may have triggered a bug in Spark SQL. There appears to be a fix: commit e991255e7203a0f7080efbd71f57574f46076711 (see https://mail-archives.apache.org/mod_mbox/spark-commits/201505.mbox/%3C6ec5793fb810447388384f3ac03ca670@git.apache.org%3E ), which describes the problem as "The problem is in java.sql.DriverManager class that can't access drivers loaded by Spark ClassLoader." Probably the simplest solution is to try the latest version from master or, failing that, cherry-pick the commit into your branch.
