Hadoop Writables NotSerializableException with the Apache Spark API

遥遥无期 · 2021-02-08 07:02

My Spark Java application throws java.io.NotSerializableException on Hadoop Writables.

public final class myAPP {
    public static void main(String[] args) throws Exception {
        // ... rest of the code is truncated in the original post
    }
}
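
A minimal sketch of the kind of job that reproduces this error, assuming the input is a SequenceFile of LongWritable/Text pairs (the input path below is hypothetical):

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

public final class myAPP {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("myAPP").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // sequenceFile() yields Hadoop Writable keys/values. Shipping them
        // anywhere (collect, shuffle, ...) goes through the default
        // JavaSerializer, and Writables do not implement
        // java.io.Serializable, hence the NotSerializableException.
        JavaPairRDD<LongWritable, Text> pairs =
            sc.sequenceFile("hdfs:///path/to/input", LongWritable.class, Text.class);
        pairs.collect();  // java.io.NotSerializableException: org.apache.hadoop.io.LongWritable

        sc.stop();
    }
}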
2 Answers
  •  春和景丽 · 2021-02-08 07:08

    As of Spark v1.4.0, you can register classes to be serialized with Kryo through the Java API SparkConf.registerKryoClasses (https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkConf.html#registerKryoClasses(java.lang.Class[])). You pass it an array of Class objects, each of which can be obtained via Class.forName (http://docs.oracle.com/javase/7/docs/api/java/lang/Class.html#forName(java.lang.String)).

    such as:

    // register the Hadoop Writable classes with Kryo by name
    new SparkConf().registerKryoClasses(new Class<?>[]{
        Class.forName("org.apache.hadoop.io.LongWritable"),
        Class.forName("org.apache.hadoop.io.Text")
    });
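
    Note that registerKryoClasses also sets spark.serializer to org.apache.spark.serializer.KryoSerializer for you, so no separate setting is needed. A fuller sketch of wiring this into an application (the app name is illustrative; if the Hadoop classes are on the compile classpath, class literals such as LongWritable.class work just as well as Class.forName):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public final class myAPP {
        public static void main(String[] args) throws Exception {
            SparkConf conf = new SparkConf()
                .setAppName("myAPP")  // illustrative
                // registers the Writables with Kryo and switches
                // spark.serializer to KryoSerializer under the hood
                .registerKryoClasses(new Class<?>[]{
                    Class.forName("org.apache.hadoop.io.LongWritable"),
                    Class.forName("org.apache.hadoop.io.Text")
                });
            JavaSparkContext sc = new JavaSparkContext(conf);
            // ... run the job that handles the Writables ...
            sc.stop();
        }
    }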
    

    Hope this helps.
