Is there a limit on the number of columns a Spark DataFrame can hold or process at a time?


Question


I would like to know whether a Spark DataFrame has a limit on the number of columns it can hold or process at a time, e.g. fewer than 500. I am asking because, when parsing an XML file with fewer than 500 tags, I can process it and generate the corresponding Parquet file successfully, but with more than 500 tags the generated Parquet file is empty. Any idea why this happens?
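For reference, here is a minimal sketch of the kind of pipeline described above, assuming the spark-xml package (com.databricks:spark-xml) is on the classpath; the rowTag value and file paths are hypothetical placeholders. Printing the parsed column and row counts before writing can help narrow down whether the columns are lost during XML parsing or during the Parquet write.

```scala
// Minimal sketch of the XML-to-Parquet pipeline described above.
// Assumes the spark-xml package (com.databricks:spark-xml) is on the
// classpath; the rowTag value and file paths are hypothetical.
import org.apache.spark.sql.SparkSession

object XmlToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("XmlToParquet")
      .getOrCreate()

    val df = spark.read
      .format("com.databricks.spark.xml") // reader provided by spark-xml
      .option("rowTag", "record")         // hypothetical row tag
      .load("/data/input.xml")            // hypothetical input path

    // Checking the parsed schema width and row count before writing helps
    // tell whether the data is lost at parse time or at write time.
    println(s"Parsed ${df.schema.fields.length} columns")
    println(s"Parsed ${df.count()} rows")

    df.write.mode("overwrite").parquet("/data/output.parquet") // hypothetical output path
    spark.stop()
  }
}
```

If the printed column count already matches the number of XML tags but the output is still empty, the problem is more likely in the write step or downstream, rather than a DataFrame column limit.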

Source: https://stackoverflow.com/questions/38696047/is-there-any-size-limit-for-spark-dataframe-to-process-hold-columns-at-a-time
