Quickly reading very large tables as dataframes

清歌不尽 2020-11-21 04:46

I have very large tables (30 million rows) that I would like to load as dataframes in R. read.table() has a lot of convenient features, but it seems like the …
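For reference, a baseline read.table() call of the kind the question describes; the file name, separator, and column types below are hypothetical, and declaring them up front (colClasses, nrows, comment.char) is the standard way to speed read.table() up:

    # Hypothetical example: declaring column types and the row count up
    # front spares read.table() from guessing them while scanning the file.
    df <- read.table("big_table.tsv", header = TRUE, sep = "\t",
                     colClasses = c("integer", "character", "numeric"),
                     nrows = 30000000, comment.char = "")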

11 Answers
  •  时光说笑
    2020-11-21 05:30

    I've tried all of the above, and [readr](https://readr.tidyverse.org/) did the best job. I have only 8 GB of RAM.

    Looping over 20 files, 5 GB each, 7 columns (a sketch of the full loop follows the call below):

    read_fwf(arquivos[i], col_types = "ccccccc",
             fwf_cols(cnpj = c(4, 17), nome = c(19, 168), cpf = c(169, 183),
                      fantasia = c(169, 223), sit.cadastral = c(224, 225),
                      dt.sitcadastral = c(226, 233), cnae = c(376, 382)))
    
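    A minimal sketch of the surrounding loop, assuming `arquivos` holds the 20 file paths and that each file is persisted and released before the next one is read; the directory, file pattern, and CSV output step are assumptions, not part of the original answer:

    library(readr)

    # Hypothetical: paths of the 20 fixed-width files, 5 GB each.
    arquivos <- list.files("data", pattern = "\\.txt$", full.names = TRUE)

    # The column layout from the call above, defined once.
    layout <- fwf_cols(cnpj = c(4, 17), nome = c(19, 168), cpf = c(169, 183),
                       fantasia = c(169, 223), sit.cadastral = c(224, 225),
                       dt.sitcadastral = c(226, 233), cnae = c(376, 382))

    for (i in seq_along(arquivos)) {
      df <- read_fwf(arquivos[i], col_types = "ccccccc", col_positions = layout)
      write_csv(df, sub("\\.txt$", ".csv", arquivos[i]))  # persist each chunk
      rm(df); gc()  # free memory before the next 5 GB file
    }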
