How to add map column in spark based on other column?

Submitted by 自古美人都是妖i on 2020-01-03 19:34:10

Question


I have this table:

|Name|Val|
|----|---|
|Bob |1  |
|Marl|3  |

And I want to transform it to a map with single element like this:

|Name|Val|MapVal|
|----|---|------|
|Bob |1  |(0->1)|
|Marl|3  |(0->3)|

Any idea how to do it in Scala? I couldn't find any way to build a map inside a withColumn statement...


Answer 1:


Found it: you just need to import the Spark SQL functions:

import org.apache.spark.sql.functions._

And then use the map function: df.withColumn("MapVal", map(lit(0), col("Val")))
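Putting it together, a minimal self-contained sketch (assuming a local SparkSession; the object and app names are illustrative, and the column names follow the question):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, lit, map}

// Illustrative sketch: assumes a local SparkSession is acceptable.
object MapColumnExample extends App {
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("map-column-example")
    .getOrCreate()
  import spark.implicits._

  // Recreate the table from the question.
  val df = Seq(("Bob", 1), ("Marl", 3)).toDF("Name", "Val")

  // map(lit(0), col("Val")) builds a MapType column whose single
  // entry maps the literal key 0 to that row's Val.
  val withMap = df.withColumn("MapVal", map(lit(0), col("Val")))
  withMap.show(false)

  spark.stop()
}
```

The resulting MapVal column has type map<int,int>; map(...) takes alternating key and value expressions, so additional key/value pairs can be appended to build larger maps.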



Source: https://stackoverflow.com/questions/42466708/how-to-add-map-column-in-spark-based-on-other-column
