Deploy Apache Spark application from another application in Java, best practice

有刺的猬 2021-02-10 04:37

I am a new user of Spark. I have a web service that allows a user to request the server to perform a complex data analysis by reading from a database and pushing the results back.

3 Answers
  • 2021-02-10 05:00

    I've had a similar requirement. Here's what I did:

    1. To submit apps, I use the hidden Spark REST Submission API: http://arturmkrtchyan.com/apache-spark-hidden-rest-api

    2. Using the same API you can query the status of a driver, or kill the job later.

    3. There's also another hidden UI JSON API, http://[master-node]:[master-ui-port]/json/, which exposes all the information available on the master UI in JSON format.

    Using "Submission API" I submit a driver and using the "Master UI API" I wait until my Driver and App state are RUNNING

  • 2021-02-10 05:09

    We are using Spark Job-Server, and it works fine with Java as well: build a jar of your Java code and wrap it with a thin Scala class so it can run on Spark Job-Server.
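
    For reference, a minimal sketch of driving Job-Server from the Java web service over its REST API. The host, app name, jar path, and class path below are placeholders (8090 is the default Job-Server port), and the Scala wrapper class is assumed to implement Job-Server's SparkJob interface:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.nio.file.Path;

    public class JobServerClient {
        public static void main(String[] args) throws Exception {
            HttpClient http = HttpClient.newHttpClient();
            String jobServer = "http://jobserver-host:8090"; // default spark-jobserver port

            // Upload the jar (Java logic wrapped by a thin Scala SparkJob class) under app name "my-app".
            http.send(HttpRequest.newBuilder()
                    .uri(URI.create(jobServer + "/jars/my-app"))
                    .POST(HttpRequest.BodyPublishers.ofFile(Path.of("/path/to/my-app.jar")))
                    .build(), HttpResponse.BodyHandlers.ofString());

            // Trigger a run of the wrapped job; the response contains a jobId
            // that can be polled via GET /jobs/<jobId> for status and result.
            HttpResponse<String> run = http.send(HttpRequest.newBuilder()
                    .uri(URI.create(jobServer + "/jobs?appName=my-app&classPath=com.example.MyJobWrapper"))
                    .POST(HttpRequest.BodyPublishers.ofString("input.string = job configuration here"))
                    .build(), HttpResponse.BodyHandlers.ofString());
            System.out.println(run.body());
        }
    }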

  • 2021-02-10 05:15

    The web server can also act as the Spark driver itself: it would hold a SparkContext instance and contain the code for working with RDDs.

    The advantage of this is that the Spark executors are long-lived. You save time by not having to start and stop them for every request, and you can cache RDDs between operations, as sketched below.
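
    A minimal sketch of that pattern, with made-up names and a made-up data source: the context is created once when the web server starts, kept alive for its lifetime, and a cached RDD is reused across requests.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    // One long-lived driver inside the web server process: the context and the
    // cached RDD are created once and shared by every incoming request.
    public class AnalysisService {

        private final JavaSparkContext sc;
        private final JavaRDD<String> records;

        public AnalysisService() {
            SparkConf conf = new SparkConf()
                    .setAppName("web-analysis-driver")
                    .setMaster("spark://spark-master:7077"); // placeholder master URL
            this.sc = new JavaSparkContext(conf);
            // Loaded once, cached on the long-lived executors, reused between requests.
            this.records = sc.textFile("hdfs:///data/records.txt").cache();
        }

        // Called from a request handler; each call runs a Spark job on the cached data.
        public long countMatching(String keyword) {
            return records.filter(line -> line.contains(keyword)).count();
        }

        public void shutdown() {
            sc.stop(); // only on web-server shutdown; keeping it open is the whole point
        }
    }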

    A disadvantage is that since the executors are running all the time, they take up memory that other processes in the cluster could otherwise use. Another is that you cannot run more than one instance of the web server, since a Spark application cannot have more than one SparkContext.
