R Shiny and Spark: how to free Spark resources?

梦谈多话 2020-12-21 08:09

Say we have a Shiny app which is deployed on a Shiny Server. We expect that the app will be used by several users via their web browsers, as usual.

1 Answer
  • 2020-12-21 08:57

    TL;DR SparkSession and SparkContext are not lightweight resources that can be started on demand.

    Putting aside all security considerations related to starting a Spark session directly from a user-facing application, maintaining a SparkSession inside the server function (starting a session on entry, stopping it on exit) is simply not a viable option.

    The server function is executed for every new user session, effectively restarting the whole Spark application each time and rendering the project unusable. And this is only the tip of the iceberg: since Spark reuses existing sessions (only one context is allowed per JVM), multiuser access could lead to random failures if a reused session has been stopped by another server call.

    One possible solution is to register onSessionEnded with spark_disconnect, as sketched below, but I am pretty sure it will be useful only in a single-user environment.
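    A minimal sketch of that idea, assuming sparklyr and a local master (the output$n_rows block is purely illustrative):

    library(shiny)
    library(sparklyr)

    server <- function(input, output, session) {
      # One connection per browser session; viable only for a single user
      sc <- spark_connect(master = "local")

      # Free the Spark resources when this user's session ends
      session$onSessionEnded(function() {
        spark_disconnect(sc)
      })

      # Illustrative use of the connection
      output$n_rows <- renderText({
        sdf_nrow(sdf_len(sc, 10))
      })
    }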

    Another possible approach is to use a global connection and wrap runApp with a function that calls spark_disconnect_all on exit:

    runApp <- function() {
      # Register the cleanup first; shiny::runApp() blocks until the app exits
      on.exit(sparklyr::spark_disconnect_all())
      shiny::runApp()
    }
    

    although in practice the resource manager should free the resources when the driver disassociates, even without stopping the session explicitly.
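    For completeness, the global connection itself could be created once at startup, e.g. in global.R (a sketch; the master URL is a placeholder for an actual cluster):

    # global.R; evaluated once per R process, so all sessions share one connection
    library(sparklyr)
    sc <- spark_connect(master = "local")  # placeholder; typically a YARN or standalone master here

    With the connection owned by the process rather than by a single session, the wrapper above only has to disconnect once, when the whole app shuts down.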
