How to deal with Spark UDF input/output of primitive nullable type

Asked by 走了就别回头了 · 2021-01-03 02:38

The issues:

1) Spark doesn't call the UDF if the input is a column of primitive type that contains null:

inputDF.show()

+----+
|   x|
+----+
|null|
| 1.0|
+----+
3 Answers
  • 2021-01-03 02:55

According to the docs:

    Note that if you use primitive parameters, you are not able to check if it is null or not, and the UDF will return null for you if the primitive input is null. Use boxed type or [[Option]] if you wanna do the null-handling yourself.


So the easiest solution is to use boxed types whenever your UDF input is a nullable column of primitive type, and/or you need the UDF to output null into a column of primitive type:

    inputDF
      .withColumn("y",
         udf { (x: java.lang.Double) => 
           (if (x == null) 1 else null): java.lang.Integer
         }.apply($"x")
      )
      .show()
    
+----+----+
|   x|   y|
+----+----+
|null|   1|
| 1.0|null|
+----+----+
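The boxed-type trick can be sanity-checked outside Spark: a `java.lang.Double` parameter keeps the null so the body can test it, while unboxing a null into a primitive `Double` silently yields 0.0, which is exactly why Spark short-circuits instead of calling a primitive-typed UDF. A minimal, Spark-free sketch:

```scala
// Spark-free sketch of the boxed-type null handling used in the UDF above.

// Boxed parameter: the null reaches the function body and can be tested.
val boxedBody: java.lang.Double => java.lang.Integer =
  x => if (x == null) 1 else null

println(boxedBody(null)) // 1
println(boxedBody(1.0))  // null

// Unboxing null into a primitive does NOT preserve it -- it becomes 0.0,
// so a primitive-typed UDF body could never distinguish null from 0.0.
println(null.asInstanceOf[Double]) // 0.0
```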
    
  • 2021-01-03 02:56

    Based on the solution provided at SparkSQL: How to deal with null values in user defined function? by @zero323, an alternative way to achieve the requested result is:

import scala.util.Try
val udfHandlingNulls = udf((x: Double) => Try(2.0).toOption)
inputDF.withColumn("y", udfHandlingNulls($"x")).show()
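The `Try(...).toOption` pattern can also be exercised without a Spark session; a `None` coming back from the UDF body is what Spark writes out as `null`. A small sketch — the `safeIncrement` name and the `String` input are illustrative, not from the answer:

```scala
import scala.util.Try

// Illustrative helper: parse and increment, absorbing any failure into None.
def safeIncrement(x: String): Option[Double] =
  Try(x.toDouble + 1.0).toOption

println(safeIncrement("1.0")) // Some(2.0)
println(safeIncrement(null))  // None: parsing null throws, Try absorbs it
```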
    
  • 2021-01-03 03:03

I would also use Artur's solution, but there is another way that avoids Java's wrapper classes, by using struct:

    import org.apache.spark.sql.functions.struct
    import org.apache.spark.sql.Row
    
    inputDF
      .withColumn("y",
         udf { (r: Row) => 
           if (r.isNullAt(0)) Some(1) else None
         }.apply(struct($"x"))
      )
      .show()
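The struct trick can be sketched without Spark as well: the `Row` handed to the UDF is just a positional container, so the null check is an index lookup. `MiniRow` below is a hypothetical stand-in for `org.apache.spark.sql.Row`, used only to illustrate the logic:

```scala
// Hypothetical stand-in for org.apache.spark.sql.Row, for illustration only.
final case class MiniRow(values: Any*) {
  def isNullAt(i: Int): Boolean = values(i) == null
}

// Same body as the answer's UDF: null in the first field maps to Some(1).
def body(r: MiniRow): Option[Int] =
  if (r.isNullAt(0)) Some(1) else None

println(body(MiniRow(null))) // Some(1)
println(body(MiniRow(1.0)))  // None
```

Wrapping the nullable column in `struct($"x")` keeps everything on the Scala side as objects, so no boxing annotations are needed in the UDF signature.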
    