Spark DataFrame groupBy and sort in the descending order (pyspark)

我在风中等你  2021-01-30 07:46

I'm using pyspark (Python 2.7.9 / Spark 1.3.1) and have a DataFrame GroupObject which I need to filter and sort in descending order. I'm trying to achieve it via this piece of code.
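
For context, a minimal sketch of the kind of object involved (the `name` column and the sample rows below are hypothetical, not from the original post): groupBy returns a GroupedData object, and calling an aggregation such as count() on it yields a regular DataFrame with an extra `count` column that can then be filtered and sorted.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("groupby-count-example").getOrCreate()

    # Hypothetical sample data, only to make the example self-contained.
    df = spark.createDataFrame(
        [("alice",), ("bob",), ("alice",), ("carol",), ("alice",), ("bob",)],
        ["name"],
    )

    grouped = df.groupBy("name")      # pyspark.sql.GroupedData; not directly sortable
    group_by_dataframe = grouped.count()  # DataFrame with columns `name`, `count`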

5 Answers
  •  滥情空心  2021-01-30 08:36

    In PySpark 2.4.4:

    1) group_by_dataframe.count().filter("`count` >= 10").orderBy('count', ascending=False)

    2) from pyspark.sql.functions import desc
       group_by_dataframe.count().filter("`count` >= 10").sort(desc('count'))

    Option 1) needs no import and is shorter and easier to read,
    so I prefer 1) over 2).
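
    Putting the two options together, a minimal runnable sketch (the sample data and the threshold of 2 instead of 10 are illustrative assumptions, not from the answer):

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import desc

    spark = SparkSession.builder.appName("groupby-sort-desc").getOrCreate()

    # Hypothetical sample data so the example stands on its own.
    df = spark.createDataFrame(
        [("alice",), ("bob",), ("alice",), ("carol",), ("alice",), ("bob",)],
        ["name"],
    )

    counts = df.groupBy("name").count()

    # Option 1): descending order via the ascending flag, no extra import needed.
    counts.filter("`count` >= 2").orderBy("count", ascending=False).show()

    # Option 2): the same result using the desc() column function.
    counts.filter("`count` >= 2").sort(desc("count")).show()

    Either call produces the same result; orderBy is an alias of sort in the DataFrame API, so the choice between them is purely stylistic.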
