When to use Kryo serialization in Spark?

Backend · open · 3 answers · 1807 views
梦谈多话 · 2021-02-19 13:17

I am already compressing RDDs using conf.set("spark.rdd.compress","true") and persist(MEMORY_AND_DISK_SER). Will using Kryo serialization make the

3 Answers
  • 2021-02-19 13:39

    Both of the RDD states you described (compressed and persisted) use serialization. When you persist an RDD, you are serializing it and saving it to disk (in your case, compressing the serialized output as well). You are right that serialization is also used for shuffles (sending data between nodes): any time data needs to leave a JVM, whether it's going to local disk or through the network, it needs to be serialized.

    Kryo is a significantly optimized serializer, and performs better than the standard Java serializer for just about everything. In your case, you may actually be using Kryo already. You can check the Spark configuration parameter:

    "spark.serializer" should be "org.apache.spark.serializer.KryoSerializer".

    If it's not, then you can set this internally with:

    conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    

    Regarding your last question ("is it even needed?"), it's hard to make a general claim about that. Kryo optimizes one of the slow steps in communicating data, but it's entirely possible that in your use case, others are holding you back. But there's no downside to trying Kryo and benchmarking the difference!
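    To make the benchmarking suggestion concrete, here is a minimal Scala sketch that combines the asker's existing settings with Kryo. The app name and sample RDD are illustrative, not from the original post:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    // Enable Kryo alongside the compression setting from the question.
    val conf = new SparkConf()
      .setAppName("KryoExample") // hypothetical app name
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .set("spark.rdd.compress", "true")

    val sc = new SparkContext(conf)

    // Persist in serialized form; with Kryo, the serialized bytes are
    // typically smaller and cheaper to produce than with Java serialization.
    val rdd = sc.parallelize(1 to 1000000)
    rdd.persist(StorageLevel.MEMORY_AND_DISK_SER)
    println(rdd.count())
    ```

    Timing the same job with and without the spark.serializer line is a simple way to measure whether Kryo helps your particular workload.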

  • 2021-02-19 13:50

    Kryo serialization is a more optimized serialization technique, so you can use it to serialize any class that appears in an RDD or DataFrame closure. Some specific situations where Kryo serialization helps:

    1. You need to serialize third-party, non-serializable classes inside an RDD or DataFrame closure.
    2. You want a more efficient serialization technique.
    3. You hit a serialization error caused by some class; you can register that class with the Kryo serializer.
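    Point 3 can be sketched as follows; `SensorReading` is a hypothetical class standing in for whatever type triggered the error:

    ```scala
    import org.apache.spark.SparkConf

    // Hypothetical class standing in for a third-party or custom type.
    case class SensorReading(id: Long, value: Double)

    val conf = new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      // Registering lets Kryo write a small numeric ID per record instead
      // of the full class name, and resolves class-specific issues.
      .registerKryoClasses(Array(classOf[SensorReading]))
      // Optional: fail fast if an unregistered class gets serialized.
      .set("spark.kryo.registrationRequired", "true")
    ```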
  • 2021-02-19 13:57

    One more point to consider: Kryo is faster than the default Java serializer at both serialization and deserialization, so it is generally worth using. That said, the performance gain may not be as large as advertised; other factors also influence program speed, such as how you write your Spark code and which libraries you choose.
