How long does RDD remain in memory?

半阙折子戏 2021-01-05 09:41

Considering that memory is limited, I had a feeling that Spark automatically removes RDDs from each node. I'd like to know: is this time configurable? How does Spark decide when to evict an RDD from memory?

4 Answers
  •  抹茶落季
    2021-01-05 10:20

    Measuring the Impact of GC

    The first step in GC tuning is to collect statistics on how frequently garbage collection occurs and the amount of time spent on GC. This can be done by adding -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps to the Java options. (See the configuration guide for info on passing Java options to Spark jobs.) The next time your Spark job is run, you will see messages printed in the worker's logs each time a garbage collection occurs. Note these logs will be on your cluster's worker nodes (in the stdout files in their work directories), not on your driver program.
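
    For example, these flags can be attached to the executor JVMs through the standard spark.executor.extraJavaOptions property. A minimal Scala sketch, assuming you configure the session yourself rather than passing --conf to spark-submit (the application name is illustrative):

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSession

        // Ask every executor JVM to log each collection to its stdout,
        // which lands in the worker's work directory as noted above.
        val conf = new SparkConf()
          .setAppName("gc-logging-demo") // illustrative name
          .set("spark.executor.extraJavaOptions",
               "-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps")

        val spark = SparkSession.builder().config(conf).getOrCreate()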

    Advanced GC Tuning

    To further tune garbage collection, we first need to understand some basic information about memory management in the JVM:

    Java heap space is divided into two regions: Young and Old. The Young generation is meant to hold short-lived objects, while the Old generation is intended for objects with longer lifetimes.

    The Young generation is further divided into three regions [Eden, Survivor1, Survivor2].

    A simplified description of the garbage collection procedure: when Eden fills up, a minor GC is run, and objects that are still alive in Eden and Survivor1 are copied to Survivor2. The Survivor regions are then swapped. If an object is old enough or Survivor2 is full, it is moved to Old. Finally, when Old is close to full, a full GC is invoked.
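
    Tying this back to the original question: Spark drops cached RDD partitions in least-recently-used (LRU) order when storage memory runs short, and you can release them earlier yourself with unpersist(). A minimal sketch (the application name and data size are illustrative):

        import org.apache.spark.sql.SparkSession
        import org.apache.spark.storage.StorageLevel

        val spark = SparkSession.builder().appName("cache-lifetime-demo").getOrCreate()
        val sc = spark.sparkContext

        // persist() only marks the RDD; partitions are actually cached
        // when the first action materializes them.
        val nums = sc.parallelize(1 to 1000000).persist(StorageLevel.MEMORY_ONLY)
        println(nums.sum()) // first action populates the cache

        // Free the cached partitions explicitly instead of waiting for
        // Spark's LRU eviction to reclaim them under memory pressure.
        nums.unpersist()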
