How to calculate rolling sum with varying window sizes in PySpark
Question

I have a Spark dataframe that contains sales prediction data for some products in some stores over a time period. How do I calculate the rolling sum of `Prediction` over a window of the next N values?

Input data

+-----------+---------+------------+------------+---+
| ProductId | StoreId | Date       | Prediction | N |
+-----------+---------+------------+------------+---+
|         1 |     100 | 2019-07-01 |       0.92 | 2 |
|         1 |     100 | 2019-07-02 |       0.62 | 2 |
|         1 |     100 | 2019-07-03 |       0.89 | 2 |
|         1 |     100 | 2019-07-04
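To pin down the semantics being asked for, here is a minimal pure-Python sketch (not the Spark solution itself) of a rolling sum with a per-row window size: for each row, sum the next N `Prediction` values. It assumes the window starts at the current row, since "next N values" is ambiguous; only the three complete rows from the input are used. In Spark itself, one common approach (an assumption, not shown in the question) is `collect_list` over a window spanning current row to unbounded following, then `slice` and `aggregate` on the collected array.

```python
def rolling_sums(preds, ns):
    """For each index i, sum preds[i : i + ns[i]].

    Windows shorter than N at the end of the series are summed as-is,
    mirroring how a Spark window simply runs out of following rows.
    """
    return [round(sum(preds[i:i + n]), 2) for i, n in enumerate(ns)]

# The three complete rows from the input table (ProductId=1, StoreId=100):
predictions = [0.92, 0.62, 0.89]
window_sizes = [2, 2, 2]

print(rolling_sums(predictions, window_sizes))  # [1.54, 1.51, 0.89]
```

The last value is just 0.89 because only one row remains in its window; whether that partial window should instead yield null is a design choice the question leaves open.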