Say we have a Shiny app which is deployed on a Shiny Server. We expect that the app will be used by several users via their web browsers, as usual.
**TL;DR** `SparkSession` and `SparkContext` are not lightweight resources which can be started on demand.
Putting aside all security considerations related to starting a Spark session directly from a user-facing application, maintaining a `SparkSession` inside `server` (starting the session on entry, stopping it on exit) is simply not a viable option.
The `server` function is executed every time a new session starts, effectively restarting the whole Spark application and rendering the project unusable. And this is only the tip of the iceberg. Since Spark reuses existing sessions (only one context is allowed per JVM), multiuser access could lead to random failures if the reused session has been stopped from another `server` call.
One possible solution is to register an `onSessionEnded` callback which calls `spark_disconnect`, but I am pretty sure it will be useful only in a single-user environment.
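A minimal sketch of that approach, assuming a sparklyr connection `sc` created once at startup (the `master` value and the app body are placeholders, not part of the original):

```r
library(shiny)
library(sparklyr)

# Assumption: a single shared connection, created when the app process starts.
sc <- spark_connect(master = "local")

server <- function(input, output, session) {
  # Disconnect when this user's session ends. Because `sc` is shared,
  # this tears down the connection for every user, hence single-user only.
  session$onSessionEnded(function() {
    spark_disconnect(sc)
  })
}
```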
Another possible approach is to use a global connection and wrap `runApp` with a function calling `spark_disconnect_all` on exit:
runApp <- function() {
  on.exit({
    spark_disconnect_all()
  })
  shiny::runApp()
}
although in practice the resource manager should free the resources when the driver disassociates, even without stopping the session explicitly.
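For completeness, the global-connection setup that the wrapper above assumes might look like this sketch (file layout and `master` value are illustrative assumptions):

```r
# global.R -- sourced once per R process by Shiny, so the connection
# is established a single time and shared by all user sessions.
library(sparklyr)

# Assumption: connecting to a local Spark for illustration;
# on a real deployment this would point at your cluster manager.
sc <- spark_connect(master = "local")
```

The key point is that the connection lives at the process level, outside `server`, so individual user sessions neither start nor stop Spark.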