In pandas, this can be done with column.name.
But how do I do the same when it's a column of a Spark DataFrame?
e.g. the calling program has a Spark DataFrame: spark_df
It turns out the answer is very simple...
    import org.apache.spark.sql.Column;

    // This is Java, but it should work the same way in PySpark.
    Column col = ds.col("colName");           // the Column object
    String theNameOftheCol = col.toString();  // yields "colName"
The variable theNameOftheCol now holds "colName".
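For completeness, here is a rough PySpark sketch of the same idea (assuming a DataFrame spark_df with a column named "colName"). Note that _jc is a private attribute pointing at the underlying Java Column, so treat it as a workaround rather than a stable API; the public DataFrame.columns property is the safer way to get column names.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    spark_df = spark.createDataFrame([(1,)], ["colName"])

    # All column names of the DataFrame (public, stable API)
    print(spark_df.columns)        # ['colName']

    # Name of a single Column object, via the underlying Java column
    # (_jc is private, so this mirrors the Java toString() trick above)
    col = spark_df["colName"]
    print(col._jc.toString())      # colName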