Convert Spark DataFrame to Pojo Object

Backend · Open · 2 answers · 1847 views

半阙折子戏, asked 2021-01-03 05:16

Please see below code:

    //Create Spark Context
    SparkConf sparkConf = new SparkConf().setAppName("TestWithObjects").setMaster("local");
    JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);

2 Answers
  • 2021-01-03 05:28

    A DataFrame is stored as Rows, so you can use the Row accessor methods to go from untyped values to typed ones. Take a look at the get methods.

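    For example, a minimal sketch of that row-by-row mapping (assuming a `Dataset<Row>` named `df` with `name` and `age` columns, and a `Person` bean with the matching setters; both names are hypothetical):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import java.util.List;
import java.util.stream.Collectors;

public class RowToPojo {
    // Row exposes typed getters (getString, getInt, getAs) plus
    // fieldIndex(columnName) to resolve a position by column name.
    static List<Person> toPeople(Dataset<Row> df) {
        return df.collectAsList().stream()
            .map(row -> {
                Person p = new Person();
                p.setName(row.getString(row.fieldIndex("name")));
                p.setAge(row.getInt(row.fieldIndex("age")));
                return p;
            })
            .collect(Collectors.toList());
    }
}
```

    Note that `collectAsList()` pulls the whole DataFrame to the driver, so this only fits small results; the encoder approach in the other answer keeps the conversion distributed.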
  • 2021-01-03 05:36

    DataFrame is simply a type alias of Dataset[Row]. Operations on a DataFrame are also referred to as "untyped transformations", in contrast to the "typed transformations" that come with strongly typed Scala/Java Datasets.
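    As a hedged illustration of that difference (assuming a `Dataset<Row>` named `df`, a typed `Dataset<Person>` named `ds`, and a `Person` bean; all three are hypothetical):

```java
import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;

public class TypedVsUntyped {
    // Untyped: the "name" column is resolved by string at runtime,
    // so a typo only surfaces when the query plan is analyzed.
    static Dataset<Row> untypedNames(Dataset<Row> df) {
        return df.select("name");
    }

    // Typed: fields go through the Person bean, so p.getName()
    // is checked by the compiler.
    static Dataset<String> typedNames(Dataset<Person> ds) {
        return ds.map(
            (MapFunction<Person, String>) p -> p.getName(),
            Encoders.STRING());
    }
}
```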

    The conversion from Dataset[Row] to Dataset[Person] is very simple in Spark:

    Dataset<Row> result = sqlContext.sql("SELECT * FROM peoples WHERE name='test'");

    At this point, Spark converts your data into DataFrame = Dataset[Row], a collection of generic Row objects, since it does not know the exact type.

    // Create an Encoder for the Person Java bean
    Encoder<Person> personEncoder = Encoders.bean(Person.class); 
    Dataset<Person> personDF = result.as(personEncoder);
    personDF.show();
    

    Now, Spark converts the Dataset[Row] into a Dataset[Person] of type-specific Scala/Java JVM objects, as dictated by the class Person.
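    The `Person` class itself is not shown in the question; a minimal sketch of the bean shape that `Encoders.bean` requires (public class, no-arg constructor, getter/setter pairs; the fields here are assumptions) might look like:

```java
import java.io.Serializable;

// Hypothetical Person bean: Encoders.bean(Person.class) inspects the
// getter/setter pairs to derive the schema, and needs the public
// no-arg constructor to instantiate objects when decoding rows.
public class Person implements Serializable {
    private String name;
    private int age;

    public Person() {}

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

    If a field has no matching getter/setter pair, it is simply left out of the derived schema, which is a common source of silently missing columns.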

    Please refer to the link below, from the Databricks blog, for further details:

    https://databricks.com/blog/2016/07/14/a-tale-of-three-apache-spark-apis-rdds-dataframes-and-datasets.html
