I wrote a simple program in Spark to write a DataFrame to a table in MySQL.
The program is as follows:
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
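Since the listing above is cut off, here is a minimal sketch of what such a program can look like. The sample data, database name, table name, and credentials are all assumptions for illustration:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import java.util.Properties

object WriteToMySql {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WriteToMySql")
      .getOrCreate()

    import spark.implicits._
    // Hypothetical sample data; the original program's DataFrame is not shown
    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    val props = new Properties()
    props.setProperty("user", "root")       // assumed credentials
    props.setProperty("password", "secret")
    props.setProperty("driver", "com.mysql.cj.jdbc.Driver")

    // Assumed connection URL and table name
    df.write
      .mode(SaveMode.Append)
      .jdbc("jdbc:mysql://localhost:3306/testdb", "people", props)

    spark.stop()
  }
}
```

This is exactly the kind of program that fails at runtime with a "No suitable driver" or ClassNotFoundException if the MySQL connector jar is not on the classpath of the submitted application.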
Eliasah was right. M2Eclipse does create a jar file, but it's not a fat/uber jar. If I explicitly configure the Maven Assembly plugin in Eclipse, I am able to create a fat jar with the dependency jars included, and the program then runs.
This is because the MySQL JDBC driver isn't present in the uber-jar that you are submitting to the cluster, whether it's a standalone cluster, YARN, Mesos, etc.
Solution 1: Since you are using Maven, you can use the Assembly plugin to build your uber-jar with all the needed dependencies. More information about the Maven Assembly plugin here.
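A minimal sketch of that configuration in the `<build><plugins>` section of your pom.xml (the main class is a placeholder for your own):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- built-in descriptor that bundles all dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <!-- hypothetical main class; replace with yours -->
        <mainClass>com.example.WriteToMySql</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

Running `mvn package` then produces a `*-jar-with-dependencies.jar` alongside the regular jar, and that is the one to pass to spark-submit.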
Solution 2: Provide these dependency libraries at runtime when you submit your application, using the --jars option. I advise you to read more about advanced dependency management and submitting applications in the official documentation.
e.g. it can look like this:
./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --jars /path/to/mysql-connector-java*.jar \
  <application-jar>
I hope this helps!