How do I decrease the memory used by a large list in python

Backend · open · 2 answers · 849 views
别跟我提以往 · 2021-01-15 01:33

I'm writing a program. It works fine, but when it loads the database (a 100 MB text file) into a list, its memory usage grows to 700-800 MB.

Code used to load the file into the list:

2 Answers
  • 2021-01-15 01:53

    As long as you don't need the complete file in memory, you can read one line at a time:

    database = []
    db = open('database/db.hdb')
    line = db.readline()
    while line:
        # strip the trailing newline before splitting into fields
        database.append(line.rstrip('\n').split(':'))
        line = db.readline()
    db.close()

    See the Python documentation for details on file.readline().
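    If each row only needs to be seen once, a generator avoids building the 700-800 MB list at all. A minimal sketch, assuming the same db.hdb path and colon-separated row format as the question (`iter_records` is a hypothetical helper name, not from the answer):

```python
def iter_records(path):
    """Yield one parsed record at a time instead of building a full list."""
    with open(path) as db:
        for line in db:
            # strip the newline, then split on the ':' field separator
            yield line.rstrip('\n').split(':')

# Usage (assuming the file from the question exists):
# for record in iter_records('database/db.hdb'):
#     process(record)  # only one record is in memory at a time
```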

  • 2021-01-15 02:09

    You should use the file object as an iterator, which reads one line at a time instead of loading the whole file at once. You can then process the database list in chunks rather than all together. For example:

    MAX = 10000  # rows per chunk; tune to your memory budget
    results = []
    database = []
    for line in open("database/db.hdb"):
        database.append(line.rstrip('\n').split(':'))
        # manage database in chunks
        if len(database) >= MAX:
            # do something with the chunk so far to get a partial result
            results.append(process_database(database))
            database = []
    if database:
        # don't forget the final partial chunk
        results.append(process_database(database))
    # do something now with the individual results to make one result
    combine_results(results)
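    Much of the 700-800 MB is per-object overhead: each row becomes a list of many small strings. Two cheap mitigations, shown as a sketch (not from the answer above, and the size behavior is CPython-specific): store rows as tuples, and intern repeated field values so identical strings are shared.

```python
import sys

# 1. A tuple holding the same elements is smaller than the equivalent list
#    (in CPython, a list carries extra bookkeeping and capacity).
row = ['alice', 'x7', '2021']
assert sys.getsizeof(tuple(row)) < sys.getsizeof(row)

# 2. sys.intern() deduplicates repeated field values across rows, so
#    identical strings share one object instead of one copy per row.
lines = ["a:b", "a:b", "a:c"]
rows = [tuple(sys.intern(f) for f in line.split(':')) for line in lines]
assert rows[0][0] is rows[1][0]  # both rows reference the same 'a' object
```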
    