How to check if the value at hand is in a particular column of some PySpark dataframe?

Asked by 执笔经年 on 2021-01-25 22:31

I have a PySpark dataframe, trips, on which I am performing aggregations. For each PULocationID, I am first computing the average of total_amount.
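
For reference, a minimal, self-contained sketch of that setup (only the DataFrame and column names come from the question; the sample rows below are hypothetical):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical sample rows standing in for the real trips data.
    trips = spark.createDataFrame(
        [(1, 10, 5.0), (1, 20, 7.5), (2, 10, 3.0), (2, 30, 12.0)],
        ["PULocationID", "DOLocationID", "total_amount"],
    )

    # The aggregation described above: average total_amount per PULocationID.
    trips.groupBy("PULocationID").agg(
        F.avg("total_amount").alias("avg_total_amount")
    ).show()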

1 Answer
  • Here is the line of code that solved the problem:

    # Counts trips whose DOLocationID appears in mtrips' DOLocationID column
    cnt_cond(col('DOLocationID').isin([i['DOLocationID'] for i in mtrips.collect()])).alias('trips_to_pop')
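
    To make that line runnable end to end, here is a sketch building on the trips DataFrame above. The definitions of mtrips and cnt_cond are assumptions (a "popular destinations" DataFrame and a common conditional-count helper), since neither is shown in the question:

        from pyspark.sql import functions as F
        from pyspark.sql.functions import col

        # Assumed: mtrips holds the "popular" drop-off locations, e.g. those
        # whose average total_amount exceeds some threshold (5.0 is arbitrary).
        mtrips = (trips.groupBy("DOLocationID")
                       .agg(F.avg("total_amount").alias("avg_amount"))
                       .filter(col("avg_amount") > 5.0))

        # Assumed helper: count the rows in a group that satisfy a condition.
        cnt_cond = lambda cond: F.sum(F.when(cond, 1).otherwise(0))

        # Collect the membership values once on the driver, then test each
        # row's DOLocationID against that list with isin().
        popular_ids = [row["DOLocationID"] for row in mtrips.collect()]

        result = trips.groupBy("PULocationID").agg(
            F.avg("total_amount").alias("avg_total_amount"),
            cnt_cond(col("DOLocationID").isin(popular_ids)).alias("trips_to_pop"),
        )
        result.show()

    Note that collect() pulls every mtrips row to the driver, which is fine for a short list of IDs; for a large one, a broadcast join (or a left-semi join for pure membership filtering) keeps the work on the executors.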
    