Efficient way of reading large txt file in python

Backend · Unresolved · 3 answers · 1946 views
Asked by 隐瞒了意图╮ on 2021-01-27 06:35

I'm trying to open a txt file with 4,605,227 rows (305 MB).

The way I have done this before is:

    import numpy as np

    data = np.loadtxt('file.txt', delimiter='\t', dtype=str)
3 Answers
  •  清歌不尽
    2021-01-27 07:18

    You can read it directly into a Pandas DataFrame, e.g.:

    import pandas as pd
    # the file is tab-delimited, so pass sep='\t'
    pd.read_csv(path, sep='\t')
    

    If you want to read faster, you can use modin:

    import modin.pandas as pd
    pd.read_csv(path, sep='\t')
    

    https://github.com/modin-project/modin
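    A minimal, self-contained sketch of the approach above, using a tiny in-memory sample to stand in for the 305 MB tab-delimited file (the sample data and names here are hypothetical):

    import pandas as pd
    from io import StringIO

    # Small stand-in for the real 'file.txt'; in practice you would
    # pass the file path instead of a StringIO object.
    sample = "a\tb\tc\n1\t2\t3\n4\t5\t6\n"

    # sep='\t' matches the tab delimiter; dtype=str mirrors the
    # asker's np.loadtxt call, keeping every column as strings.
    df = pd.read_csv(StringIO(sample), sep="\t", dtype=str)
    print(df.shape)  # (2, 3)

    For a file that does not fit comfortably in memory, pandas can also read it in pieces via read_csv's chunksize parameter, processing one DataFrame chunk at a time.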
