Passing Array to Python Spark Lit Function

南笙 2021-02-18 16:48

Let's say I have a numpy array a that contains the numbers 1-10. So a is [1 2 3 4 5 6 7 8 9 10].

Now, I also have a Python Spark dataframe to which I want to add my numpy array a as a column.

2 Answers
你的背包 2021-02-18 16:58

    In the Scala API, we can use the typedLit function to add Array or Map values as a column. Unlike lit, typedLit can also handle parameterized Scala types such as Seq, List, and Map.

    // Ref : https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.functions$

    Here is sample code that adds an Array and a Map as column values.

    import org.apache.spark.sql.functions.typedLit
    // toDF requires the session's implicits (already in scope in spark-shell)
    import spark.implicits._

    val df1 = Seq((1, 0), (2, 3)).toDF("a", "b")

    df1.withColumn("seq", typedLit(Seq(1, 2, 3)))
        .withColumn("map", typedLit(Map(1 -> 2)))
        .show(truncate = false)
    

    // Output

    +---+---+---------+--------+
    |a  |b  |seq      |map     |
    +---+---+---------+--------+
    |1  |0  |[1, 2, 3]|[1 -> 2]|
    |2  |3  |[1, 2, 3]|[1 -> 2]|
    +---+---+---------+--------+
    

    I hope this helps.
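
    Since the question asks about PySpark rather than Scala, here is a minimal Python sketch of the same idea (my own illustration, not part of the original answer): wrap each numpy value with lit() and combine the results with array(), both standard pyspark.sql.functions. The DataFrame and the column name "arr" are made up for the example.

    import numpy as np
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    a = np.arange(1, 11)  # the array [1 2 3 4 5 6 7 8 9 10] from the question
    df = spark.createDataFrame([(1, 0), (2, 3)], ["a", "b"])

    # int(x) converts each numpy int64 scalar to a plain Python int, which lit() accepts;
    # array() then combines the literal columns into one array column on every row
    df.withColumn("arr", F.array(*[F.lit(int(x)) for x in a])).show(truncate=False)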
