How to submit multiple Spark applications in parallel without spawning separate JVMs?

一生所求 · 2021-02-09 08:20

The problem is that you need to launch a separate JVM to create a separate session with a different amount of RAM per job.

How can I submit several Spark applications simultaneously?
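
For context, a rough sketch of the separate-JVM approach in question, using Spark's `org.apache.spark.launcher.SparkLauncher` API; the jar path, main class, and master are placeholders:

```scala
import org.apache.spark.launcher.SparkLauncher

object ParallelSubmit {
  def main(args: Array[String]): Unit = {
    // Each launch() spawns its own spark-submit child JVM,
    // so each job can be given a different amount of memory.
    def submit(driverMem: String): Process =
      new SparkLauncher()
        .setAppResource("/path/to/my-app.jar") // placeholder path
        .setMainClass("com.example.MyJob")     // placeholder class
        .setMaster("yarn")                     // placeholder master
        .setConf(SparkLauncher.DRIVER_MEMORY, driverMem)
        .setConf(SparkLauncher.EXECUTOR_MEMORY, "4g")
        .launch()

    // Two applications running in parallel with different driver heaps.
    val procs = Seq(submit("2g"), submit("8g"))
    procs.foreach(_.waitFor())
  }
}
```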

2 Answers
  •  忘了有多久 · 2021-02-09 09:08

    tl;dr I'd say it's not possible.

    A Spark application is at least one JVM, and it is at spark-submit time that you specify the requirements of that single JVM (or of the set of JVMs that act as executors).
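
    The driver's heap is sized when its JVM starts, which is why the memory settings have to go on the spark-submit command line (or through SparkLauncher). A minimal sketch of why setting them later is too late, assuming a plain local run:

    ```scala
    import org.apache.spark.sql.SparkSession

    // This config is applied AFTER the driver JVM has already started,
    // so spark.driver.memory has no effect on the current process's heap.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("too-late")
      .config("spark.driver.memory", "8g") // ignored for a running JVM
      .getOrCreate()

    // The heap is still whatever -Xmx the process was launched with:
    println(Runtime.getRuntime.maxMemory / (1024 * 1024) + " MB")
    ```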

    If, however, you want different JVM configurations without launching separate JVMs, that does not seem possible (even outside Spark, as long as a JVM is in use).
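
    The closest you get inside a single JVM is `SparkSession.newSession()`, but the new session shares the same SparkContext and therefore the same JVM heap, so per-session RAM is still off the table. A minimal sketch:

    ```scala
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("shared-jvm")
      .getOrCreate()

    // A second session in the same JVM: isolated SQL conf and temp views,
    // but the SAME SparkContext, executors, and heap as the first one.
    val other = spark.newSession()
    assert(other.sparkContext eq spark.sparkContext)
    ```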
