Let's say I have a numpy array a that contains the numbers 1-10. So a is [1 2 3 4 5 6 7 8 9 10].
Now, I also have a Python Spark dataframe to which I want to add my numpy array as a new column.
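For concreteness, a minimal setup along those lines might look like this (the dataframe contents below are just a placeholder, not real data):

import numpy as np
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

a = np.arange(1, 11)  # array([ 1,  2, ..., 10])
df = spark.createDataFrame([(1, 0), (2, 3)], ["a", "b"])  # placeholder dataframe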
In the Scala API, you can use the "typedLit" function to add Array or Map values as a column.
// Ref : https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.functions$
Here is sample code that adds an Array and a Map as column values.
import org.apache.spark.sql.functions.typedLit
import spark.implicits._  // needed for toDF; already in scope in the spark-shell

val df1 = Seq((1, 0), (2, 3)).toDF("a", "b")
df1.withColumn("seq", typedLit(Seq(1, 2, 3)))
  .withColumn("map", typedLit(Map(1 -> 2)))
  .show(truncate = false)
// Output
+---+---+---------+--------+
|a |b |seq |map |
+---+---+---------+--------+
|1 |0 |[1, 2, 3]|[1 -> 2]|
|2 |3 |[1, 2, 3]|[1 -> 2]|
+---+---+---------+--------+
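For the PySpark case the question actually asks about: typedLit is not exposed in the Python API (and in older releases lit alone rejects lists and numpy values), but you can get the same effect by building the array literal element by element with array and lit. A minimal sketch under those assumptions:

import numpy as np
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

a = np.arange(1, 11)
df = spark.createDataFrame([(1, 0), (2, 3)], ["a", "b"])

# Build an ArrayType column from per-element literals; int(x)
# converts numpy.int64 into a plain Python int, which lit() accepts.
df.withColumn("seq", F.array([F.lit(int(x)) for x in a])) \
  .withColumn("map", F.create_map(F.lit(1), F.lit(2))) \
  .show(truncate=False)

Recent PySpark releases also accept numpy scalars in lit directly; if yours does, the int() conversion can be dropped.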
I hope this helps.