PySpark Error during group by and write. Is it because of partitioning?

予麋鹿 2021-02-11 10:00

I am reading some comments from a Hive data source; you can also look at the physical plan.

comments = hive.executeQuery(f"""
    select *
    from mytab
""")