I'm writing a program. It works fine, but when it loads the database (a 100 MB text file) into a list, its memory usage becomes 700-800 MB.
Code used to load the file into the list:
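(The original snippet is missing here; a hypothetical reconstruction of the kind of loading code described, reading every line into a list of ':'-split fields, might look like this:)

    # hypothetical reconstruction -- the actual code from the question is not shown
    database = []
    for line in open('database/db.hdb'):
        database.append(line.rstrip('\n').split(':'))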
As long as you don't need the complete file in memory, you could read one line at a time:
    database = []
    db = open('database/db.hdb')
    line = db.readline()
    while line:
        # strip the trailing newline before splitting on ':'
        fields = line.rstrip('\n').split(':')
        database.append(fields)
        line = db.readline()
    db.close()
See the Python documentation for details on file.readline().
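As an aside on why the memory usage balloons so far past 100 MB: each line becomes a Python list of small strings, and every one of those objects carries its own overhead on top of the raw text. A minimal sketch to see the per-record cost (the record value here is made up):

    import sys

    # hypothetical record: one ':'-separated line split into fields
    record = 'deadbeef:1234:SomeSignatureName'.split(':')

    print(sys.getsizeof(record))                   # size of the list object itself
    print(sum(sys.getsizeof(f) for f in record))   # plus the size of each string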
You should use the file object as an iterator to reduce memory usage from the file. You could then process the database list in chunks rather than all at once. For example:
    MAX = 10000  # process this many records per chunk
    results = []
    database = []
    for line in open("database/db.hdb"):
        database.append(line.rstrip('\n').split(':'))
        # you could then manage database in chunks:
        if len(database) > MAX:
            # do something with the database list so far to get a result
            results.append(process_database(database))
            database = []
    if database:
        # don't forget the final, partially filled chunk
        results.append(process_database(database))
    # do something now with the individual results to make one result
    combine_results(results)
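process_database and combine_results are placeholders, not real functions; to make the sketch runnable you could define something like the following (counting records is just an arbitrary example of per-chunk work):

    def process_database(chunk):
        # hypothetical: derive a result from one chunk; here, just count its records
        return len(chunk)

    def combine_results(results):
        # hypothetical: merge the per-chunk results into one final result
        return sum(results)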