Question
I would like to know whether a Spark DataFrame has a limit on the number of columns. For example, is the maximum number of columns a DataFrame can process or hold at a time less than 500? I am asking because, when parsing an XML file with fewer than 500 tags, I can process it and generate the corresponding Parquet file successfully, but if it has more than 500 tags, the generated Parquet file is empty. Any idea why this happens?
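For reference, here is a minimal sketch (not from the original post; the column count, schema, and output path are all made up for illustration) that builds a DataFrame with more than 500 columns and writes it to Parquet. Running something like this can help verify whether Spark itself imposes a column limit, or whether the problem lies in the XML parsing step instead:

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

object WideDataFrameCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("WideDataFrameCheck")
      .master("local[*]")
      .getOrCreate()

    // Build a schema with more columns than the 500 mentioned in the question.
    val numCols = 600
    val schema = StructType((1 to numCols).map(i => StructField(s"col$i", StringType)))

    // A single row with a value for every column.
    val row = Row.fromSeq((1 to numCols).map(i => s"value$i"))
    val df = spark.createDataFrame(spark.sparkContext.parallelize(Seq(row)), schema)

    // Write and read back to confirm the wide DataFrame round-trips through Parquet.
    val outputPath = "/tmp/wide_parquet_check"  // hypothetical output path
    df.write.mode("overwrite").parquet(outputPath)
    println(spark.read.parquet(outputPath).columns.length)  // expected to print 600

    spark.stop()
  }
}
```

If the round-trip succeeds with 600 columns, the empty Parquet output in the original scenario is more likely caused by how the XML is parsed into the DataFrame than by a column-count limit in Spark.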
Source: https://stackoverflow.com/questions/38696047/is-there-any-size-limit-for-spark-dataframe-to-process-hold-columns-at-a-time