How to Join Multiple Columns in Spark SQL using Java for filtering in DataFrame

梦谈多话 · 2021-02-08 19:07
  • DataFrame a contains columns x, y, z, k
  • DataFrame b contains columns x, y, a

How can I express a join condition on both x and y in Java?

    a.join(b, ...)
            
2 Answers
  • 南方客
    2021-02-08 19:59

    Spark SQL provides a group of methods on Column, marked as java_expr_ops, which are designed for Java interoperability. It includes the and method (see also or), which can be used here:

    a.col("x").equalTo(b.col("x")).and(a.col("y").equalTo(b.col("y"))
    
