How to find PySpark DataFrame memory usage?

情深已故 2021-02-03 12:29

For a pandas DataFrame, the info() function reports memory usage. Is there an equivalent in PySpark? Thanks.

4 answers
  •  野的像风
    2021-02-03 13:21

    You can persist the DataFrame in memory and then trigger an action such as df.count(). Once the DataFrame is cached, you can check its size under the Storage tab of the Spark web UI. Let me know if this works for you.
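
    A minimal sketch of this approach, assuming a SparkSession named spark and a DataFrame df already exist. The persist/count part uses only the public PySpark API; the last loop reads storage info through the internal _jsc handle, which is not a public API, so treat it as an assumption:

    ```python
    from pyspark import StorageLevel

    # Cache the DataFrame in memory (MEMORY_ONLY keeps partitions in the executors' heap)
    df.persist(StorageLevel.MEMORY_ONLY)

    # Trigger an action so the DataFrame is actually materialized and cached
    df.count()

    # At this point, open the Spark web UI (by default http://<driver-host>:4040)
    # and look at the Storage tab: the cached DataFrame's "Size in Memory" is shown there.

    # Hypothetical programmatic alternative via the internal Java SparkContext:
    # each RDDInfo entry reports the cached size in bytes (memSize).
    for rdd_info in spark.sparkContext._jsc.sc().getRDDStorageInfo():
        print(rdd_info.name(), rdd_info.memSize())
    ```

    Note that the cached size reflects Spark's in-memory representation after deserialization, so it can differ noticeably from the size of the source files on disk.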
