My question is straightforward: what is the best way to apply a custom function to every column of a PySpark DataFrame?
Specifically, I am trying to apply a sum over a window to each column.