How do I generate batches of data using Keras from huge Parquet part files

清歌不尽 2021-02-13 14:41

I have a dataset of 100 Parquet part files (each ~10 GB) in a directory, on which I want to train a DSSM-based model.

Now, given the huge amount of data, I am confused about how to read these files and generate batches for Keras training, since the full dataset will not fit in memory at once.
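Below is a minimal sketch of one common pattern for this situation: stream each Parquet part file in chunks with pyarrow and expose the stream to Keras through tf.data.Dataset.from_generator, so no part file is ever loaded whole. The directory glob, the column names (query_tokens, doc_tokens, label), the sequence length, and the tensor shapes are hypothetical placeholders, not taken from the question; they would need to be adapted to the real DSSM features.

```python
import glob

import numpy as np
import pyarrow.parquet as pq
import tensorflow as tf

PARQUET_GLOB = "data/part-*.parquet"   # hypothetical location of the 100 part files
BATCH_SIZE = 1024
SEQ_LEN = 64                           # assumed fixed token length per example


def parquet_batches():
    """Yield ((query, doc), label) NumPy batches, streaming one chunk at a time."""
    for path in sorted(glob.glob(PARQUET_GLOB)):
        pf = pq.ParquetFile(path)
        # iter_batches reads the file in chunks, so a ~10 GB part file
        # never has to be held in memory in full.
        for record_batch in pf.iter_batches(batch_size=BATCH_SIZE):
            df = record_batch.to_pandas()
            # Assumed schema: list-valued token columns plus a scalar label column.
            query = np.stack(df["query_tokens"].to_list()).astype(np.int32)
            doc = np.stack(df["doc_tokens"].to_list()).astype(np.int32)
            label = df["label"].to_numpy(dtype=np.float32)
            yield (query, doc), label


# Wrap the generator so Keras can consume it directly via model.fit(dataset).
dataset = tf.data.Dataset.from_generator(
    parquet_batches,
    output_signature=(
        (
            tf.TensorSpec(shape=(None, SEQ_LEN), dtype=tf.int32),  # query token ids
            tf.TensorSpec(shape=(None, SEQ_LEN), dtype=tf.int32),  # document token ids
        ),
        tf.TensorSpec(shape=(None,), dtype=tf.float32),            # relevance / click label
    ),
).prefetch(tf.data.AUTOTUNE)

# model.fit(dataset, epochs=...)  # the streaming dataset replaces an in-memory array
```

This keeps memory bounded by the chunk size rather than the file size; shuffling across files would need extra work (for example, a shuffle buffer on the dataset or pre-shuffled part files).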
