I have a number of CSV files (90+) that are too large to fit in memory (~0.5 GB each, zipped), all with the same schema. I want to convert them to Parquet and then use Dask for time series analysis.