Defining DataFrame Schema for a table with 1500 columns in Spark

轻奢々 2021-01-23 02:29

I have a table with around 1500 columns in SQL Server. I need to read the data from this table, convert it to the proper datatype format, and then insert the records into Oracle.

3 Answers
  •  离开以前
    2021-01-23 02:44

The options for reading a table with 1500 columns:

1) Using a case class

A case class would not work because it is limited to 22 fields (for Scala versions earlier than 2.11).

    2) Using StructType

You can use a StructType to define the schema and create the DataFrame; see the sketch below.
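    A minimal sketch of this approach (the connection URLs, table names, credentials, and column names are placeholders; for 1500 columns you would typically generate the field list from table metadata rather than write it by hand):

        import org.apache.spark.sql.SparkSession
        import org.apache.spark.sql.types._

        val spark = SparkSession.builder()
          .appName("WideTableSchema")
          .getOrCreate()

        // Build the StructType programmatically. Only a few hypothetical
        // fields are shown; generate the full ~1500-field list from metadata.
        val fields = Seq(
          StructField("id", LongType, nullable = false),
          StructField("name", StringType, nullable = true),
          StructField("created_at", TimestampType, nullable = true)
          // ... remaining fields
        )
        val schema = StructType(fields)

        // Read the wide table from SQL Server over JDBC (placeholder URL).
        val df = spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://host:1433;databaseName=mydb")
          .option("dbtable", "dbo.wide_table")
          .option("user", "user")
          .option("password", "password")
          .load()

        // Cast each column to the type declared in the schema, then write
        // the result to Oracle over JDBC (placeholder URL again).
        val typed = schema.fields.foldLeft(df) { (acc, f) =>
          acc.withColumn(f.name, acc.col(f.name).cast(f.dataType))
        }

        typed.write
          .format("jdbc")
          .option("url", "jdbc:oracle:thin:@//host:1521/service")
          .option("dbtable", "TARGET_TABLE")
          .option("user", "user")
          .option("password", "password")
          .mode("append")
          .save()

    Note that the JDBC source derives its schema from the database metadata, so the StructType here drives the casting step rather than being passed to the reader.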

3) Using the spark-csv package

You can use the spark-csv package. With it, you can set .option("inferSchema", "true"), which automatically infers the schema from the file, as sketched below.
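    This applies when the data sits in a file rather than coming straight over JDBC. A minimal sketch, assuming the SQL Server table has been exported to a CSV file at a placeholder path (Spark 2.x has CSV support built in; on Spark 1.x you would add the external com.databricks:spark-csv package and use format("com.databricks.spark.csv")):

        import org.apache.spark.sql.SparkSession

        val spark = SparkSession.builder()
          .appName("InferSchemaExample")
          .getOrCreate()

        // Sample the file to guess each column's type instead of
        // declaring all 1500 columns by hand.
        val df = spark.read
          .option("header", "true")       // first line holds column names
          .option("inferSchema", "true")  // infer types by scanning the data
          .csv("/path/to/export.csv")     // placeholder path

        df.printSchema()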
