Spark 1.6.1, Scala API.
For a DataFrame, I need to replace all null values of a certain column with 0. I see two ways to do this:

1. myDF.withColumn("pipC
They are not the same, but performance should be similar. `na.fill` uses `coalesce`, but it replaces both NaN and NULLs, not only NULLs:
import org.apache.spark.sql.functions.{when, lit}
import spark.implicits._

val y = when($"x" === 0, $"x".cast("double")).when($"x" === 1, lit(null)).otherwise(lit("NaN").cast("double"))
val df = spark.range(0, 3).toDF("x").withColumn("y", y)
df.withColumn("y", when($"y".isNull, 0.0).otherwise($"y")).show() // replaces NULLs only; NaN is kept
df.na.fill(0.0, Seq("y")).show() // replaces both NULLs and NaNs

(Note that `Column.isNull` is a parameterless method, so it is written without parentheses.)
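To see the semantic difference without spinning up Spark, here is a minimal plain-Scala sketch (the object and method names are my own, not Spark APIs): one helper mirrors `na.fill`'s behavior of treating NaN like null for numeric columns, the other mirrors the `isNull` check, which lets NaN pass through.

```scala
// Plain-Scala sketch of the two null-handling strategies (not Spark code).
object NullVsNaN {
  // Mirrors df.na.fill: replaces both null and NaN with the replacement value.
  def fillNaLike(v: java.lang.Double, replacement: Double): Double =
    if (v == null || v.isNaN) replacement else v

  // Mirrors when($"y".isNull, 0.0).otherwise($"y"): replaces null only.
  def whenIsNull(v: java.lang.Double, replacement: Double): java.lang.Double =
    if (v == null) replacement else v // NaN passes through unchanged

  def main(args: Array[String]): Unit = {
    val ys: Seq[java.lang.Double] = Seq(0.0, null, Double.NaN)
    println(ys.map(fillNaLike(_, 0.0))) // List(0.0, 0.0, 0.0)
    println(ys.map(whenIsNull(_, 0.0))) // List(0.0, 0.0, NaN)
  }
}
```

On the NaN row, only the `fillNaLike` variant produces 0.0, which is exactly why the two Spark snippets above can disagree.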