Spark SQL filter multiple fields

滥情空心 2020-12-17 15:12

What is the correct syntax for filtering on multiple columns in the Scala API? If I want to do something like this:

dataFrame.filter($"col01" === "somethi
1 Answer
  • 2020-12-17 15:49

    I think I see what the issue is. For some reason, Spark does not allow two `!=`'s in the same filter. Need to look at how `filter` is defined in the Spark source code.

    Now, for your code to work, you can use this to do the filter:

    df.filter(col("item").notEqual("") && col("group").notEqual("-1"))
    

    or use two filters in the same statement:

    df.filter($"item" !== "").filter($"group" !== "-1").select(....)
    

    This link can help with the different Spark methods.
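    As a self-contained sketch of both approaches, here is a hedged example assuming a local SparkSession and a hypothetical DataFrame with "item" and "group" string columns. Note that `!==` was deprecated in Spark 2.0; `=!=` is the current equivalent:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object FilterExample {
  def main(args: Array[String]): Unit = {
    // Local session for illustration only; in a real job the
    // master/appName would come from your cluster configuration.
    val spark = SparkSession.builder()
      .appName("FilterExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical sample data with "item" and "group" columns.
    val df = Seq(("a", "1"), ("", "1"), ("b", "-1")).toDF("item", "group")

    // Both conditions in a single filter; =!= replaces the
    // deprecated !== operator in Spark 2.0+.
    val combined = df.filter($"item" =!= "" && $"group" =!= "-1")

    // Equivalent form using two chained filters with notEqual.
    val chained = df.filter(col("item").notEqual(""))
                    .filter(col("group").notEqual("-1"))

    combined.show()
    chained.show()
    spark.stop()
  }
}
```

    Both variants produce the same plan after Catalyst pushes the predicates together, so the choice is mostly stylistic.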
