How to get name of dataframe column in pyspark?

深忆病人 2021-02-01 13:44

In pandas, this can be done with column.name.

But how do you do the same when it's a column of a Spark DataFrame?

e.g. the calling program has a Spark DataFrame: spark_df
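
For concreteness, a minimal sketch of the pandas behaviour referred to above (the DataFrame contents are hypothetical):

    import pandas as pd

    pdf = pd.DataFrame({"age": [25, 30]})  # hypothetical example data
    col = pdf["age"]   # selecting a column gives a Series
    print(col.name)    # prints: age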

5 Answers
  •  别那么骄傲
    2021-02-01 14:21

    You can get the names from the schema by doing

    spark_df.schema.names
    

    Printing the schema is also useful for visualizing it:

    spark_df.printSchema()
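
    For reference, a minimal runnable sketch putting both together (the DataFrame and its columns here are hypothetical); note that spark_df.columns returns the same list of names as spark_df.schema.names:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("column-names").getOrCreate()

    # Hypothetical DataFrame for illustration
    spark_df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

    print(spark_df.schema.names)  # ['id', 'name']
    print(spark_df.columns)       # equivalent list of column names
    spark_df.printSchema()
    # root
    #  |-- id: long (nullable = true)
    #  |-- name: string (nullable = true)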
    
