I am trying to execute it via:
Process process = Runtime.getRuntime().exec(spark_cmd);
with no luck. The same command, run from a shell, starts my application.
One way is the Spark launcher, as @Sandeep Purohit said.
I'd suggest the shell-script approach, using the nohup command to submit the job like this:

nohup spark-submit <parameters> 2>&1 < /dev/null &

This worked for me in the case of MapReduce executions; you can try the same way for Spark background jobs as well. Have a look at https://en.wikipedia.org/wiki/Nohup

Whenever you get a message, you can poll for that event and call this shell script. Below is a code snippet to do this:
/**
 * Submits a job (for example spark-submit or a MapReduce job) by calling a shell command.
 * <pre>You can call spark-submit or a MapReduce job on the fly like this, by calling a shell script.</pre>
 * @param commandToExecute the shell command to run
 * @return Boolean.TRUE if the command ran to completion, Boolean.FALSE otherwise
 */
public static Boolean executeCommand(final String commandToExecute) {
    try {
        final Runtime rt = Runtime.getRuntime();
        // LOG.info("process command -- " + commandToExecute);
        final String[] arr = { "/bin/sh", "-c", commandToExecute };
        final Process proc = rt.exec(arr);
        // LOG.info("process started");
        final int exitVal = proc.waitFor();
        LOG.trace("commandToExecute exited with code: " + exitVal);
        proc.destroy();
    } catch (final Exception e) {
        LOG.error("Exception occurred while launching process: " + e.getMessage());
        return Boolean.FALSE;
    }
    return Boolean.TRUE;
}
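For instance, you could fire off a background Spark job through the method above like this (the main class, jar path, and master here are placeholders, not from the original post):

// Hypothetical usage: jar path, main class, and master are placeholders.
final String sparkCmd =
        "nohup spark-submit --class com.example.MyApp --master yarn "
        + "/path/to/my-app.jar 2>&1 < /dev/null &";
final Boolean submitted = executeCommand(sparkCmd);

Note that because the command backgrounds itself with &, executeCommand() returns as soon as the shell exits, not when the Spark job finishes.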
Moreover, to debug, check whether the process is running:

ps -aef | grep "your pid or process name"
The command below will list the files opened by the process:

lsof -p <your process id>
Also, have a look at the question "process.waitFor() never returns".
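The usual cause of that hang is that the child process fills its stdout/stderr pipe buffer and nothing reads it. Here is a minimal sketch of draining the output before waiting (class and variable names are illustrative, not from the original snippet):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public final class ShellRunner {
    public static int run(final String commandToExecute) throws Exception {
        final ProcessBuilder pb =
                new ProcessBuilder("/bin/sh", "-c", commandToExecute);
        pb.redirectErrorStream(true); // merge stderr into stdout, so one reader is enough
        final Process proc = pb.start();
        // Read the child's output; otherwise the pipe buffer can fill up
        // and waitFor() may block forever.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // or hand off to your logger
            }
        }
        return proc.waitFor();
    }
}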
You can submit a Spark job via spark-submit from Java code with the help of SparkLauncher, so go through the link below and check it out:
https://spark.apache.org/docs/1.4.0/api/java/org/apache/spark/launcher/SparkLauncher.html
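A minimal sketch with that API (the Spark home, jar path, main class, and master below are placeholders for your own setup):

import org.apache.spark.launcher.SparkLauncher;

public class LaunchSparkJob {
    public static void main(String[] args) throws Exception {
        // All paths and class names here are placeholders.
        Process spark = new SparkLauncher()
                .setSparkHome("/opt/spark")                    // assumed install location
                .setAppResource("/path/to/your-spark-app.jar") // your application jar
                .setMainClass("com.example.YourSparkApp")      // your driver main class
                .setMaster("yarn-cluster")
                .launch();
        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with code: " + exitCode);
    }
}

launch() returns a plain java.lang.Process, so the same stream-draining caveat from above applies if the launched job produces a lot of output.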