get min and max from a specific column scala spark dataframe

梦谈多话 2021-02-01 04:37

I would like to access the min and max of a specific column in my dataframe, but I don't have the column's header, only its number. How should I do this using Scala?

7 Answers
  • 2021-02-01 05:19

    Using the Spark functions min and max, you can find the minimum or maximum value of any column in a DataFrame.

    import org.apache.spark.sql.functions.{min, max}
    import spark.implicits._ // needed for toDF on a local Seq
    
    val df = Seq((5, 2), (10, 1)).toDF("A", "B")
    
    df.agg(max($"A"), min($"B")).show()
    /*
    +------+------+
    |max(A)|min(B)|
    +------+------+
    |    10|     1|
    +------+------+
    */
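
    Since the question says only the column's number is known, not its header, one way (a sketch, assuming a zero-based index) is to look the name up via df.columns, which returns the column names in order, and then aggregate on that name:

    ```scala
    import org.apache.spark.sql.{Row, SparkSession}
    import org.apache.spark.sql.functions.{min, max}

    val spark = SparkSession.builder().master("local[*]").appName("minmax").getOrCreate()
    import spark.implicits._

    val df = Seq((5, 2), (10, 1)).toDF("A", "B")

    val idx = 1                   // hypothetical: the zero-based column number we were given
    val colName = df.columns(idx) // "B"

    // head() pulls the single aggregated Row back to the driver
    val Row(mn: Int, mx: Int) = df.agg(min(colName), max(colName)).head()
    println(s"min = $mn, max = $mx") // min = 1, max = 2
    ```

    Passing the resolved name as a string to min/max works because those functions accept a column name directly, so no $-interpolator (and hence no hard-coded header) is needed.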
    