What is the correct syntax for filtering on multiple columns in the Scala API? I want to do something like this:
dataFrame.filter($\"col01\" === \"somethi
I think I see what the issue is. For some reason, Spark does not allow two != comparisons in the same filter. We would need to look at how filter is defined in the Spark source code to confirm why.
For your code to work, you can do the filter like this:
df.filter(col("item").notEqual("") && col("group").notEqual("-1"))
or chain two filters in the same statement:
df.filter($"item" !== "").filter($"group" !== "-1").select(....)
This link can help with the different Spark methods.