General strategies for memory/speed problems

春和景丽 2021-01-21 23:05

I have a c++ code which runs through about 200 ASCII files, does some basic data processing, and outputs a single ASCII file with (basically) all of the data.

The progra

4 answers
  •  悲&欢浪女
    2021-01-21 23:55

    Could you share your program?

    1. One thing to check is whether you are using data structures that do not scale well as the number of elements grows.

    e.g. searching a list of a million elements is extremely slow, because each lookup is a linear traversal (O(n)); a binary search tree brings that down to O(log n) per lookup, and a hash table to O(1) on average.
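    A minimal sketch of the difference (the data and values here are made up for illustration): the same membership check done with a linear scan over a `std::list` versus a hash lookup in a `std::unordered_set`.

    ```cpp
    #include <algorithm>
    #include <cassert>
    #include <list>
    #include <unordered_set>

    int main() {
        std::list<int> lst = {10, 20, 30};

        // Linear scan: walks the list node by node, O(n) per lookup.
        bool found_in_list =
            std::find(lst.begin(), lst.end(), 20) != lst.end();

        // Hash table built from the same elements: O(1) average per lookup.
        std::unordered_set<int> set(lst.begin(), lst.end());
        bool found_in_set = set.count(20) > 0;

        assert(found_in_list && found_in_set);
        return 0;
    }
    ```

    With 200 files' worth of data, repeated linear scans are usually where the time goes, so this swap alone can change the complexity class of the whole run.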

    2. You should also check whether you are holding on to data at the end of each cycle (run). Ideally, you should release all the resources at the end of each run.
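    One way to get that for free in C++ is to scope the per-run containers to the loop body, so their memory is reclaimed every iteration instead of accumulating. A sketch, with a hypothetical `process_file` standing in for the real per-file parsing:

    ```cpp
    #include <cassert>
    #include <string>
    #include <vector>

    // Hypothetical stand-in for the real per-file processing.
    std::vector<double> process_file(const std::string& /*path*/) {
        return {1.0, 2.0, 3.0};
    }

    int main() {
        for (int run = 0; run < 200; ++run) {
            // `data` lives only for this iteration: its storage is
            // destroyed at the closing brace, so memory use stays flat
            // across runs instead of growing with every file.
            std::vector<double> data = process_file("input.txt");
            assert(data.size() == 3);
        } // `data` released here, every iteration
        return 0;
    }
    ```

    If a container must outlive the loop, `data.clear()` followed by `data.shrink_to_fit()` at the end of each run achieves a similar effect.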

    3. It also sounds like there may be a handle leak.
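    With ~200 files, forgetting a `fclose` (or `close`) in any code path will eventually exhaust file descriptors. RAII streams avoid the problem entirely, since the handle is closed when the stream goes out of scope. A sketch, using a throwaway file name:

    ```cpp
    #include <cassert>
    #include <cstdio>
    #include <fstream>

    int main() {
        // Create a small file so we have something to open (name is arbitrary).
        { std::ofstream out("example.txt"); out << "data"; }

        for (int i = 0; i < 3; ++i) {
            // RAII: the ifstream closes its handle automatically at the end
            // of each iteration, so no handle can leak even on early returns.
            std::ifstream in("example.txt");
            assert(in.is_open());
        }

        std::remove("example.txt");
        return 0;
    }
    ```

    By contrast, a raw `FILE*` from `fopen` leaks one handle per file unless every exit path calls `fclose`.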
