Source: https://stackoverflow.com/questions/63572210/how-spark-handles-out-of-memory-error-when-cached-memory-only-persistence-data
Tags: apache-spark, caching, out-of-memory, rdd, partitioning