Spark Unable to find JDBC Driver

栀梦 2020-11-28 08:47

So I've been using sbt with assembly to package all my dependencies into a single jar for my Spark jobs. I've got several jobs where I was using c3p0 to set up …

10 Answers
  • 2020-11-28 09:16

    I had the same problem running jobs on a Mesos cluster in cluster mode.

    To use a JDBC driver, you need to add the dependency to the system classpath, not to the framework classpath. The only way I found to do this was to add the dependency to the spark-defaults.conf file on every instance of the cluster.

    The properties to add are spark.driver.extraClassPath and spark.executor.extraClassPath, and the path must be on the local file system.
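
    As a sketch, the two entries in spark-defaults.conf might look like this (the jar path below is hypothetical; the file must exist at that local path on every node):

```
spark.driver.extraClassPath   /opt/jars/mysql-connector-java-5.1.47.jar
spark.executor.extraClassPath /opt/jars/mysql-connector-java-5.1.47.jar
```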

  • 2020-11-28 09:24

    A simple way is to copy "mysql-connector-java-5.1.47.jar" into the "spark-2.4.3\jars\" directory.
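
    The copy itself is a one-liner. As a sketch (the real paths are hypothetical, so the commands below simulate the layout with temporary directories and an empty stand-in jar, making them safe to run anywhere):

```shell
# Stand-ins for the real locations; replace with your actual paths.
SPARK_HOME=$(mktemp -d)                           # e.g. /opt/spark-2.4.3
JAR="$(mktemp -d)/mysql-connector-java-5.1.47.jar"
mkdir -p "$SPARK_HOME/jars"
touch "$JAR"                                      # stand-in for the downloaded driver jar

# The actual step: put the connector jar on Spark's classpath.
cp "$JAR" "$SPARK_HOME/jars/"
```

    Jars placed in Spark's jars/ directory are picked up by both the driver and the executors, so no extra --jars or classpath flags are needed.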

  • 2020-11-28 09:29

    I faced the same issue when trying to run the spark-shell command from my Windows machine. The path you pass for the driver location, as well as for the jar you are using, should be in double quotes; otherwise it gets misinterpreted and you will not get the output you want.

    You would also have to install the JDBC driver for SQL Server from the link: JDBC Driver

    I used the command below, which worked fine for me on my Windows machine:

    spark-shell --driver-class-path "C:\Program Files\Microsoft JDBC Driver 6.0 for SQL Server\sqljdbc_6.0\enu\jre8\sqljdbc42.jar" --jars "C:\Program Files\Microsoft JDBC Driver 6.0 for SQL Server\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"

  • 2020-11-28 09:32

    This person was having similar issue: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-DataFrame-with-MySQL-td22178.html

    Have you updated your connector driver to the most recent version? Also, did you specify the driver class when you called load()?

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;

    // assuming sqlContext is an existing SQLContext
    Map<String, String> options = new HashMap<String, String>();
    options.put("url", "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456");
    options.put("dbtable", "video");
    options.put("driver", "com.mysql.cj.jdbc.Driver"); // specify the driver class explicitly here
    DataFrame jdbcDF = sqlContext.load("jdbc", options); // pre-1.4 API; on 1.4+ use sqlContext.read().format("jdbc").options(options).load()

    In spark/conf/spark-defaults.conf, you can also set spark.driver.extraClassPath and spark.executor.extraClassPath to the path of your MySQL driver .jar.
