In pandas, this can be done with column.name.
But how can the same be done when it is a column of a Spark DataFrame?
E.g., the calling program has a Spark DataFrame: spark_df
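For context, here is a minimal sketch of the setup; the DataFrame contents and the column names (id, value) are made up for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical stand-in for the question's spark_df
spark_df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

column = spark_df["id"]  # a pyspark.sql.Column object

# Unlike pandas, .name on a Spark Column is a method (an alias for .alias()),
# so it does not return the column's name as a string.
```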
As @numeral correctly said, column._jc.toString() works fine for unaliased columns.

For aliased columns (i.e. column.alias("whatever")), the alias can be extracted without using regular expressions: str(column).split(" AS ")[1].split("`")[1].
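A runnable sketch of both cases is below. Note that _jc is a private attribute of the Column class, and the string form of an aliased column varies by Spark version; this follows the answer's assumption that the alias is rendered with backticks, as in id AS `whatever`:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
spark_df = spark.createDataFrame([(1, "a")], ["id", "value"])

# Unaliased column: the underlying Java column's string form is the name itself
plain = spark_df["id"]
print(plain._jc.toString())  # -> id

# Aliased column: the string form embeds the alias
aliased = spark_df["id"].alias("whatever")
print(aliased._jc.toString())

# Extract the alias without regular expressions, as described above
# (assumes a Spark version that wraps the alias in backticks):
name = str(aliased).split(" AS ")[1].split("`")[1]
print(name)  # -> whatever
```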
I don't know Scala syntax, but I'm sure it can be done the same way there.