How do I pass program arguments to the main function when running spark-submit with a JAR?


I know this is a trivial question, but I could not find the answer on the internet.

I am trying to run a Java class with a main function and pass program arguments to it when submitting the JAR with spark-submit.

4 answers
  • 2020-12-29 06:15

    Arguments passed before the JAR file are treated as arguments to the JVM, whereas arguments passed after the JAR file are passed on to the user's program.

    bin/spark-submit --class classname -Xms256m -Xmx1g something.jar someargument
    

    Here, s will equal someargument, whereas -Xms and -Xmx are passed to the JVM.

    public static void main(String[] args) {
        // args[0] == "someargument" from the spark-submit command above
        String s = args[0];
    }
    
  • 2020-12-29 06:16
    spark-submit --class SparkWordCount --master yarn --jars <jar1.jar>,<jar2.jar> \
      sparkwordcount-1.0.jar /user/user01/input/alice.txt /user/user01/output
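
    Here, everything after sparkwordcount-1.0.jar is handed to the application, so /user/user01/input/alice.txt and /user/user01/output show up as args[0] and args[1] in main. A minimal sketch of what such a SparkWordCount class could look like (the body below is an assumed illustration written against the Spark 2.x Java API, not the actual class inside that JAR):

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    public final class SparkWordCount {
        public static void main(String[] args) {
            // Arguments placed after the application JAR on the spark-submit line:
            // args[0] -> input path, args[1] -> output path
            String inputPath = args[0];
            String outputPath = args[1];

            SparkConf conf = new SparkConf().setAppName("SparkWordCount");
            try (JavaSparkContext sc = new JavaSparkContext(conf)) {
                sc.textFile(inputPath)
                  .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                  .mapToPair(word -> new Tuple2<>(word, 1))
                  .reduceByKey(Integer::sum)
                  .saveAsTextFile(outputPath);
            }
        }
    }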
    
  • 2020-12-29 06:19

    I found the correct command in this tutorial.

    The command should be of the form:

    bin/spark-submit --class full.package.name.ClassName analytics-package.jar someargument someArgument
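
    Both values after analytics-package.jar arrive, in order, in the application's main method. A minimal sketch (the class below is only an assumed illustration matching the --class value):

    package full.package.name;

    public final class ClassName {
        public static void main(String[] args) {
            // args[0] == "someargument", args[1] == "someArgument"
            System.out.println("first argument:  " + args[0]);
            System.out.println("second argument: " + args[1]);
        }
    }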
    
  • 2020-12-29 06:25

    The first unrecognized argument is treated as the primaryResource (the JAR file in our case). Check out SparkSubmitArguments.handleUnknown.

    All the arguments after the primaryResource are treated as arguments to the application. Check out SparkSubmitArguments.handleExtraArgs.

    To better understand how the arguments are parsed, check out SparkSubmitOptionParser.parse; the two methods above are called from it.
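
    As a rough, standalone illustration of that order (a simplified sketch only; Spark's real parser recognizes many more options and handles errors), the idea is: skip recognized options and their values, treat the first leftover token as the primary resource, and collect everything after it as application arguments:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    public final class SubmitParseSketch {
        // A few spark-submit options that take a value, for demonstration only.
        private static final Set<String> OPTIONS_WITH_VALUE =
                Set.of("--class", "--master", "--jars");

        public static void main(String[] args) {
            String[] commandLine = {
                    "--class", "SparkWordCount", "--master", "yarn",
                    "app.jar", "someargument", "someOtherArgument"
            };

            String primaryResource = null;
            List<String> appArgs = new ArrayList<>();

            for (int i = 0; i < commandLine.length; i++) {
                if (primaryResource == null) {
                    if (OPTIONS_WITH_VALUE.contains(commandLine[i])) {
                        i++; // recognized option: skip its value as well
                    } else {
                        // first unrecognized token becomes the primary resource (the JAR)
                        primaryResource = commandLine[i];
                    }
                } else {
                    // everything after the primary resource is an application argument
                    appArgs.add(commandLine[i]);
                }
            }

            System.out.println("primaryResource = " + primaryResource); // app.jar
            System.out.println("appArgs         = " + appArgs);         // [someargument, someOtherArgument]
        }
    }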
