Python: Memory usage and optimization when modifying lists

失恋的感觉  2021-02-04 03:28

The problem

My concern is the following: I am storing a relatively large dataset in a classical Python list, and in order to process the data I must iterate over the list.

7 Answers
  •  执念已碎
    2021-02-04 04:14

    Python stores only references to objects in the list - not the elements themselves. If you grow a list item by item, the list (that is, the array of references) fills up the excess memory that Python preallocated at its end; Python then copies the references into a new, larger block, while the list elements stay at their old locations. Since your code visits every element of the old list anyway, copying the references into a new list with new_list[i] = old_list[i] adds almost no overhead.

    The only performance hint is to allocate the new list at its full size up front instead of appending element by element (though the Python docs note that append is still amortized O(1), because the over-allocation grows with the list size).

    If you lack the space even for the new list of references, then I fear you are out of luck: any data structure that avoids the O(n) in-place insert/delete will almost certainly be larger than a simple array of 4- or 8-byte entries.
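
    To make the over-allocation visible, here is a minimal sketch (CPython-specific; the exact sizes and growth points are implementation details and vary by version) that watches sys.getsizeof jump as a list grows, and contrasts preallocating the reference array with appending:

        import sys

        # Watch CPython over-allocate: the size of the list object (the
        # array of references) grows in jumps, not by one pointer per append.
        lst = []
        last_size = sys.getsizeof(lst)
        for i in range(32):
            lst.append(None)
            size = sys.getsizeof(lst)
            if size != last_size:
                print(f"len={len(lst):>3}  size={size} bytes")
                last_size = size

        # Preallocating the references up front does one allocation instead
        # of several intermediate resizes; the elements themselves live
        # elsewhere on the heap either way.
        n = 1_000_000
        prealloc = [None] * n          # one allocation for all n references
        for i in range(n):
            prealloc[i] = i * 2        # fill in place, no resizing

        appended = []
        for i in range(n):
            appended.append(i * 2)     # amortized O(1), with periodic copies

    Note that sys.getsizeof reports only the list object itself (header plus reference array), not the objects the references point to, which is exactly why the two loops end up with the same elements but different allocation histories.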
