Dask KilledWorker while reading and saving large .csv files

Asked by 北荒 on 2021-02-04 17:03 · unresolved · 0 replies
I have around 1.5 TB of data, divided into around 5500 JSON files on GCS, that I need to process (nearest-neighbour search) using `map_partitions` and then save the results. Each .json file has size b
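The post does not include code, but the workflow it describes (read many JSON files, run a per-partition computation, write results back out) can be sketched as below. Everything here is an assumption for illustration: the bucket paths, the `find_nearest` / `process_partition` helpers, and the toy reference set are hypothetical, not from the original question. The relevant pattern for avoiding `KilledWorker` is keeping one file per partition (`blocksize=None`) and writing one output file per partition, so no single worker or the client ever has to hold the full 1.5 TB.

```python
# Sketch: one-file-per-partition processing with Dask, writing results
# out per partition instead of collecting them on the client.
# All names (find_nearest, REFERENCE_POINTS, bucket paths) are
# illustrative assumptions, not taken from the original post.
import math

# Hypothetical reference set the nearest-neighbour search runs against.
REFERENCE_POINTS = [(0.0, 0.0), (10.0, 10.0)]

def find_nearest(x, y):
    """Return the index of the closest reference point.

    Plain Python on purpose, so the partition function ships to
    workers without extra dependencies.
    """
    best_i, best_d = -1, math.inf
    for i, (rx, ry) in enumerate(REFERENCE_POINTS):
        d = (x - rx) ** 2 + (y - ry) ** 2
        if d < best_d:
            best_i, best_d = i, d
    return best_i

def process_partition(df):
    """Runs on one pandas partition inside map_partitions."""
    df = df.copy()
    df["nearest"] = [find_nearest(x, y) for x, y in zip(df["x"], df["y"])]
    return df

def main():
    # Imported inside main() so the sketch parses without Dask installed.
    import dask.dataframe as dd

    # blocksize=None keeps one JSON file per partition; with ~5500
    # moderately sized files this bounds per-worker memory, which is
    # the usual way to avoid KilledWorker in this kind of job.
    ddf = dd.read_json("gs://my-bucket/data/*.json", blocksize=None)
    ddf = ddf.map_partitions(process_partition)

    # One CSV per partition, streamed straight to GCS.
    ddf.to_csv("gs://my-bucket/results/part-*.csv", index=False)

# Call main() on a machine/cluster with dask and gcsfs available.
```

If workers are still killed, the usual next steps are raising the worker memory limit or repartitioning so that a single file's parsed representation fits comfortably in one worker's memory.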
