I am trying to write my DataFrame to a MySQL table, but I get No suitable driver found for jdbc:mysql://dbhost
when I try to write.
As part of the prep
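For context, the write looks roughly like this (a sketch; df, the credentials, the database, and the table name are placeholders):

```scala
import java.util.Properties

val props = new Properties()
props.setProperty("user", "myuser")         // placeholder credentials
props.setProperty("password", "mypassword") // placeholder credentials

// Fails with: java.sql.SQLException: No suitable driver found for jdbc:mysql://dbhost
df.write.jdbc("jdbc:mysql://dbhost/mydb", "mytable", props)
```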
This is a bug related to the classloader. This is the ticket for it: https://issues.apache.org/jira/browse/SPARK-8463 and this is the pull request that fixes it: https://github.com/apache/spark/pull/6900.
A workaround is to copy mysql-connector-java-5.1.35-bin.jar to every machine at the same location as it is on the driver.
It looks like you have hit a bug in Spark SQL. There is a fix; the commit is e991255e7203a0f7080efbd71f57574f46076711
(see https://mail-archives.apache.org/mod_mbox/spark-commits/201505.mbox/%3C6ec5793fb810447388384f3ac03ca670@git.apache.org%3E ), and it describes the problem as "The problem is in java.sql.DriverManager
class that can't access drivers loaded by Spark ClassLoader." The simplest solution is probably to try the latest version from master, or, failing that, to cherry-pick the commit into your branch.
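If rebuilding Spark isn't an option, a workaround that often helps with this kind of DriverManager/classloader issue is to name the driver class explicitly in the connection properties, so Spark registers it itself rather than relying on java.sql.DriverManager discovery (a sketch; credentials, database, and table are placeholders, and the driver option assumes the MySQL connector jar is on the classpath):

```scala
import java.util.Properties

val props = new Properties()
props.setProperty("user", "myuser")         // placeholder credentials
props.setProperty("password", "mypassword") // placeholder credentials
// Tell Spark which driver class to load explicitly, instead of relying on
// java.sql.DriverManager to discover it through the Spark ClassLoader
props.setProperty("driver", "com.mysql.jdbc.Driver")

df.write.jdbc("jdbc:mysql://dbhost/mydb", "mytable", props)
```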