How to get the name of a dataframe column in PySpark?

Backend · open · 5 answers · 1019 views

深忆病人 · 2021-02-01 13:44

In pandas, this can be done with column.name.

But how do I do the same with a column of a Spark dataframe?

E.g. the calling program has a Spark dataframe: spark_df
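
For concreteness, a minimal pandas sketch of what column.name does (the column name here is illustrative):

    import pandas as pd

    pdf = pd.DataFrame({"colName": [1, 2, 3]})
    column = pdf["colName"]  # a pandas Series
    print(column.name)       # prints: colName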

5 Answers
  •  梦谈多话
    2021-02-01 14:07

    I found that the answer is very simple...

    // This is Java, but the same idea carries over to PySpark
    import org.apache.spark.sql.Column;

    Column col = ds.col("colName");          // the Column object for "colName"
    String theNameOftheCol = col.toString(); // toString() on a plain column reference yields "colName"

    The variable theNameOftheCol now holds "colName".
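
    Since the question asks about PySpark specifically, here is a minimal sketch of the same lookup in Python (the dataframe and column names are illustrative). One caveat: in PySpark, calling str() on a Column returns wrapper text such as Column<'colName'> rather than the bare name, so spark_df.columns or spark_df.schema.names is the more direct route to plain name strings.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    spark_df = spark.createDataFrame([(1, "a")], ["colName", "other"])

    # Lists of plain column-name strings
    print(spark_df.columns)       # ['colName', 'other']
    print(spark_df.schema.names)  # ['colName', 'other']

    # str() on a Column includes wrapper text, unlike the Java toString()
    col = spark_df["colName"]
    print(str(col))               # e.g. Column<'colName'>; exact format varies by Spark version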
