Execute spark-submit programmatically from Java


I am trying to execute spark-submit via:

Process process = Runtime.getRuntime().exec(spark_cmd);

with no luck. The same command, when run from a shell, starts my application successfully.

2 Answers
  •  无人共我
    2021-01-20 23:52

    One way is the Spark launcher API, as described by @Sandeep Purohit.
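    As a reference point, here is a minimal sketch of that approach, assuming the spark-launcher dependency is on the classpath; the Spark home, jar path, and class name below are placeholders you would replace with your own:

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class LauncherExample {
        public static void main(String[] args) throws Exception {
            // All paths and names below are placeholders for your own setup.
            SparkAppHandle handle = new SparkLauncher()
                    .setSparkHome("/opt/spark")               // placeholder Spark install dir
                    .setAppResource("/path/to/your-app.jar")  // placeholder application jar
                    .setMainClass("com.example.YourSparkApp") // placeholder main class
                    .setMaster("local[*]")
                    .startApplication();                      // non-blocking launch

            // Poll until the application reaches a terminal state.
            while (!handle.getState().isFinal()) {
                Thread.sleep(1000);
            }
            System.out.println("Final state: " + handle.getState());
        }
    }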

    I'd offer a shell-script approach using the nohup command to submit the job, like this:

    This worked for me in the case of MapReduce executions; you can try the same approach for Spark background jobs as well.

    Have a look at https://en.wikipedia.org/wiki/Nohup. The submission command looks like:

    nohup spark-submit 2>&1 < /dev/null &

    Whenever you get a message, you can poll that event and call this shell script. Below is the code snippet to do this:

    /**
     * Submits a spark-submit (or MapReduce) job on the fly by invoking a
     * shell command or script.
     *
     * @param commandToExecute the shell command to execute
     */
    public static Boolean executeCommand(final String commandToExecute) {
        try {
            final Runtime rt = Runtime.getRuntime();
            // LOG.info("process command -- " + commandToExecute);
            final String[] arr = { "/bin/sh", "-c", commandToExecute };
            final Process proc = rt.exec(arr);
            // LOG.info("process started ");
            final int exitVal = proc.waitFor();
            LOG.trace("commandToExecute exited with code: " + exitVal);
            proc.destroy();
        } catch (final Exception e) {
            LOG.error("Exception occurred while launching process: " + e.getMessage());
            return Boolean.FALSE;
        }
        return Boolean.TRUE;
    }
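    For example, the method above could then be invoked with the nohup submission command; the script and log paths here are hypothetical:

    // Hypothetical paths; the trailing '&' backgrounds the job as described above.
    String cmd = "nohup sh /path/to/submit-job.sh > /tmp/submit-job.log 2>&1 < /dev/null &";
    Boolean submitted = executeCommand(cmd);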

    Moreover, to debug, check on the launched process with:

    ps -aef | grep "your pid or process name"
    

    The command below will list the files opened by the process:

    lsof -p <pid>
    

    Also, have a look at the question "process.waitFor() never returns".
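    The usual cause of that hang is that the child's stdout/stderr are never consumed, so the pipe buffer fills and waitFor() blocks forever. Here is a minimal sketch of one way to avoid it; the command shown is only illustrative:

    import java.io.IOException;

    public class NoHangExample {
        public static void main(String[] args) throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder("/bin/sh", "-c", "spark-submit --version");
            pb.inheritIO(); // forward child stdout/stderr to this JVM so pipes cannot fill
            Process proc = pb.start();
            int exit = proc.waitFor(); // safe now: output is not piling up in a pipe
            System.out.println("exit code: " + exit);
        }
    }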
