Convert Row to map in spark scala

傲寒 2021-02-04 18:23

I have a row from a data frame and I want to convert it to a Map[String, Any] that maps column names to the values in the row for that column.

Is there an easy way to do this?

4 Answers
  •  囚心锁ツ
    2021-02-04 18:48

    You can use getValuesMap:

    import spark.implicits._  // needed for toDF outside the spark-shell
    val df = Seq((1, 2.0, "a")).toDF("A", "B", "C")
    val row = df.first
    

    To get Map[String, Any]:

    row.getValuesMap[Any](row.schema.fieldNames)
    // res19: Map[String,Any] = Map(A -> 1, B -> 2.0, C -> a)
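
    What getValuesMap[Any] does can be pictured as pairing the schema's field names with the row's values. A minimal plain-Scala sketch (no Spark needed; names and vals here are hypothetical stand-ins for row.schema.fieldNames and row.toSeq):

    ```scala
    // Stand-ins for row.schema.fieldNames and row.toSeq
    val names = Array("A", "B", "C")
    val vals  = Seq[Any](1, 2.0, "a")

    // Zip field names with values and collect into a Map[String, Any]
    val asMap: Map[String, Any] = names.zip(vals).toMap
    // asMap == Map("A" -> 1, "B" -> 2.0, "C" -> "a")
    ```

    The equivalent one-liner against a real Row would be row.schema.fieldNames.zip(row.toSeq).toMap.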
    

    Or you can get a Map[String, AnyVal] in this simple case, since the values are not complex objects:

    row.getValuesMap[AnyVal](row.schema.fieldNames)
    // res20: Map[String,AnyVal] = Map(A -> 1, B -> 2.0, C -> a)
    

    Note: the type parameter you pass to getValuesMap is an unchecked cast, so the returned values can be labelled as any type. You cannot rely on it to tell you what data types you actually have; you need to know what types the columns hold to begin with.
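
    This caveat can be demonstrated without Spark: the type parameter is erased at runtime, so a wrong element type only fails when a value is actually used at that type. A minimal sketch:

    ```scala
    val values: Map[String, Any] = Map("A" -> 1, "B" -> 2.0, "C" -> "a")

    // The cast "succeeds" even though column "C" holds a String,
    // because a Map's element type is erased at runtime.
    val asVals = values.asInstanceOf[Map[String, AnyVal]]
    println(asVals("A")) // 1 -- fine
    // Using asVals("C") as an actual AnyVal, e.g.
    // asVals("C").asInstanceOf[Int], would throw a
    // ClassCastException only at that point.
    ```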
