How to submit multiple Spark applications simultaneously

The problem is that you have to launch a separate JVM to create a separate session with a different amount of RAM per job.
tl;dr I'd say it's not possible.
A Spark application is at least one JVM, and it is at spark-submit
time that you specify the requirements of that single JVM (or of the bunch of JVMs that act as executors).
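For example (a minimal sketch; the JAR and class names are hypothetical), two jobs with different memory requirements mean two separate spark-submit invocations, each starting its own driver JVM:

```bash
# Job 1: its own driver JVM with 2g of heap, executors with 4g each
spark-submit --class com.example.JobOne \
  --driver-memory 2g --executor-memory 4g job-one.jar

# Job 2: a second, independent JVM with different memory settings
spark-submit --class com.example.JobTwo \
  --driver-memory 8g --executor-memory 16g job-two.jar
```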
If, however, you want different JVM configurations without launching separate JVMs, that does not seem possible (even outside Spark, as long as a JVM is involved).
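To see why (a plain-JVM sketch, nothing Spark-specific; the classpath and main class are hypothetical): the maximum heap is fixed by -Xmx when the JVM starts and cannot be changed on a running JVM, so a different memory budget always means a new process:

```bash
# The heap ceiling is a launch-time option; changing it requires a new JVM
java -Xmx2g -cp app.jar com.example.Main   # JVM #1: 2g max heap for its lifetime
java -Xmx8g -cp app.jar com.example.Main   # JVM #2: must be launched separately
```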