How far will Spark RDD cache go?

Asked by 一个人的身影, 2021-02-05 19:42

Say I have three RDD transformations called on rdd1:

val rdd2 = rdd1.f1
val rdd3 = rdd2.f2
val rdd4 = rdd3.f3

Now if I call rdd4.cache(), will that also keep rdd2 and rdd3 in memory, or do I need to cache each of them separately?

1 Answer
  • Answered 2021-02-05 20:07

    The whole idea of cache is that Spark does not keep results in memory unless you tell it to. If you cache only the last RDD in the chain, only that RDD's results are kept in memory. So yes, you do need to cache them separately. Keep in mind, though, that you only need to cache an RDD if you are going to use it more than once, for example:

    rdd4.cache()
    val v1 = rdd4.lookup("key1")
    val v2 = rdd4.lookup("key2")
    

    If you do not call cache in this case, rdd4 will be recalculated on every call to lookup (or any other action that forces evaluation). You might also want to read the paper on RDDs; it is fairly easy to follow and explains the reasoning behind many of the design choices in how RDDs work.
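    To make the point concrete, here is a minimal sketch. The transformations and key/value data are illustrative stand-ins (f1/f2/f3 are not defined in the original post), and an existing SparkContext `sc` is assumed:

    ```scala
    // Hedged sketch: assumes an existing SparkContext `sc`.
    // mapValues/filter stand in for the unspecified f1/f2/f3.
    val rdd1 = sc.parallelize(Seq(("key1", 1), ("key2", 2)))
    val rdd2 = rdd1.mapValues(_ * 2)   // plays the role of f1
    val rdd3 = rdd2.filter(_._2 > 0)   // plays the role of f2
    val rdd4 = rdd3.mapValues(_ + 1)   // plays the role of f3

    // Caching rdd4 does NOT cache rdd2 or rdd3: each RDD in the
    // lineage must be cached explicitly if it will be reused.
    rdd2.cache()
    rdd4.cache()

    val v1 = rdd4.lookup("key1") // first action: computes rdd4 and fills its cache
    val v2 = rdd4.lookup("key2") // served from the cached partitions of rdd4
    ```

    Note that cache() is lazy: nothing is stored until the first action (here, the first lookup) actually materializes the partitions.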
