What is use of method addJar() in Spark?

Asked by 执念已碎 on 2021-01-23 02:33

In a Spark job, I don't know how to import and use the jars shared via the method SparkContext.addJar(). It seems that this method is able to move jars into some…

1 Answer
  • Answered 2021-01-23 03:16

    Did you try setting the path of the jar with the "local:" prefix? From the documentation:

    public void addJar(String path)
    

    Adds a JAR dependency for all tasks to be executed on this SparkContext in the future. The path passed can be either a local file, a file in HDFS (or other Hadoop-supported filesystems), an HTTP, HTTPS or FTP URI, or local:/path for a file on every worker node.
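    For example, a minimal sketch of both forms (the jar paths below are placeholders, not real files):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(
      new SparkConf().setAppName("addJarDemo").setMaster("local[*]"))

    // Jar on the driver's local filesystem: Spark ships it to each executor
    // so that classes in it can be loaded by future tasks.
    sc.addJar("/path/on/driver/one.jar")

    // "local:" prefix: the jar is assumed to already exist at this same path
    // on every worker node, so nothing is copied over the network.
    sc.addJar("local:/opt/libs/two.jar")
    ```

    Note that addJar makes the jar available to tasks running on the executors; it does not add it to the driver JVM's classpath (for that, use spark.driver.extraClassPath or the --driver-class-path option).
    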

    You can try this way as well:

    val conf = new SparkConf()
                 .setMaster("local[*]")
                 .setAppName("tmp")
                 .setJars(Array("/path1/one.jar", "/path2/two.jar"))
    
    val sc = new SparkContext(conf)
    

    Also take a look at the Spark configuration documentation and check the spark.jars option.

    Or set the "--jars" parameter in spark-submit:

    --jars /path/1.jar,/path/2.jar
    
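    A full spark-submit invocation using --jars might look like this (the class name, master, and paths are placeholders):

    ```shell
    # Hypothetical example: adjust class, master, and jar paths to your setup.
    spark-submit \
      --class com.example.Main \
      --master local[*] \
      --jars /path/1.jar,/path/2.jar \
      my-app.jar
    ```

    The comma-separated jars are distributed to the executors and added to their classpath before the application starts.
    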

    Or edit conf/spark-defaults.conf:

    spark.driver.extraClassPath /path/1.jar:/fullpath/2.jar
    spark.executor.extraClassPath /path/1.jar:/fullpath/2.jar
    