import numpy as np

# (remaining rows truncated in the original; column names inferred
# from the answer below, which references "session" and "timestamp1")
df = spark.createDataFrame(
    [(1, 1, None), (1, 2, float(5)), (1, 3, np.nan), (1, 4, None)],
    ("session", "timestamp1", "id2"))
You should use the when function (together with otherwise):
from pyspark.sql.functions import when

targetDf = df.withColumn(
    "timestamp1",
    when(df["session"] == 0, 999).otherwise(df["timestamp1"]))
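For intuition, here is the same conditional replacement sketched with plain numpy (no Spark session needed): np.where is the eager, in-memory analogue of when(...).otherwise(...). The sample values are made up for illustration.

```python
import numpy as np

# toy columns standing in for df["session"] and df["timestamp1"]
session = np.array([0, 1, 0, 2])
timestamp1 = np.array([10.0, 20.0, 30.0, 40.0])

# where session == 0, substitute 999; otherwise keep timestamp1
result = np.where(session == 0, 999, timestamp1)
print(result)  # [999.  20. 999.  40.]
```

Like the Spark version, this evaluates the condition per row and picks one of two values, leaving the rows that fail the condition untouched.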