As the result of some aggregation I end up with the following Spark DataFrame:
+------------+-----------------+-----------------+
|sale_user_id|     gross_profit|total_sale_volume|
+------------+-----------------+-----------------+
The easiest way is to cast the double column to decimal, giving an appropriate precision and scale (precision is the total number of digits, scale is the number of digits after the decimal point):
from pyspark.sql.types import DecimalType

df = df.withColumn('total_sale_volume', df.total_sale_volume.cast(DecimalType(18, 2)))
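
For context, here is a minimal, self-contained sketch of the same cast. The SparkSession setup and the sample rows are assumptions made for illustration; only the column names come from the question:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col
from pyspark.sql.types import DecimalType

spark = SparkSession.builder.master("local[*]").appName("decimal-cast").getOrCreate()

# Hypothetical aggregated result with double-typed profit columns (values made up).
df = spark.createDataFrame(
    [("20569706", 3322960.35357, 3322922.31205),
     ("20773320", 3641327.19357, 3641271.57649)],
    ["sale_user_id", "gross_profit", "total_sale_volume"],
)

# DecimalType(18, 2): at most 18 digits in total, 2 of them after the decimal point.
df = (df
      .withColumn("gross_profit", col("gross_profit").cast(DecimalType(18, 2)))
      .withColumn("total_sale_volume", col("total_sale_volume").cast(DecimalType(18, 2))))

df.show()  # both profit columns now display with exactly two decimal places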