What is the recommended way to end a spark job inside a conditional statement?
I am doing validation on my data, and if the validation fails, I want to end the Spark job gracefully.
There is nothing to say that you can't call `stop()` inside an `if` statement, but there is very little reason to do so and it is probably a mistake. It seems implicit in your question that you may be attempting to open multiple Spark sessions.
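That said, if you genuinely want to abort early when validation fails, the shape is simply "validate, stop the one session, return". Here is a minimal runnable sketch of that pattern; note that `validate`, `run_job`, and the `FakeSession` class are illustrative stand-ins I've made up (not Spark API) so the control flow can be shown without a cluster. In real PySpark you would pass the single `SparkSession` your program created via `SparkSession.builder.getOrCreate()` and call its real `stop()`.

```python
def validate(rows):
    # Hypothetical check: every row must carry a non-null "id" field.
    return all(row.get("id") is not None for row in rows)

class FakeSession:
    """Minimal stand-in for a SparkSession, so the sketch runs anywhere."""
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True

def run_job(session, rows):
    if not validate(rows):
        # Graceful end: stop the single long-lived session exactly once,
        # then return instead of raising or calling sys.exit().
        session.stop()
        return "aborted: validation failed"
    # ... normal processing would continue here with the same session ...
    return "ok"
```

The key point the pattern preserves: there is exactly one session, it is stopped at most once, and no code path tries to create a second one afterwards.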
The Spark session is intended to be left open for the life of the program. If you try to start a second one, Spark throws an exception and prints some background to the logs, including a JIRA ticket that discusses the topic.
If you wish to run multiple Spark jobs, you can submit them to the same context; one context can run multiple jobs at once.