How to avoid running out of memory in a high-memory-usage application? (C/C++)

天命终不由人 · 2021-02-19 14:04

I have written a converter that takes OpenStreetMap XML files and converts them to a binary runtime rendering format that is typically about 10% of the original size. Input file

15 Answers
  •  谎友^ (OP) · 2021-02-19 14:30

    On 32-bit XP your maximum program address space is 2 GB. On top of that, you get fragmentation from DLLs and drivers loading into your address space. Finally, you have the problem of your heap fragmenting.

    Your best move is just to get it over with and run as a 64-bit process (on a 64-bit system). Suddenly all these problems go away. You can use a better heap to mitigate heap fragmentation effects, and you can try using VirtualAlloc to grab your memory in one big contiguous chunk (and then you get to manage it from there!) to discourage DLLs/drivers from fragmenting it.
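    A minimal sketch of that VirtualAlloc approach, assuming a Windows build; the 1 GB reservation and 1 MB commit sizes below are illustrative assumptions, not anything from the original question:

        // Reserve one big contiguous range up front, then commit pages as needed.
        // Sizes here are illustrative assumptions, not recommendations.
        #include <windows.h>
        #include <cstdio>

        int main() {
            const SIZE_T reserveSize = SIZE_T(1) << 30;  // reserve 1 GB of address space
            const SIZE_T commitChunk = SIZE_T(1) << 20;  // commit 1 MB at a time

            // MEM_RESERVE claims address space without backing it with physical
            // memory, so later DLL/driver loads can't fragment this range.
            void* base = VirtualAlloc(nullptr, reserveSize, MEM_RESERVE, PAGE_NOACCESS);
            if (!base) {
                std::fprintf(stderr, "reserve failed: %lu\n", GetLastError());
                return 1;
            }

            // Commit the first chunk only when it is actually needed; repeat for
            // further chunks as your own allocator hands out memory from the range.
            if (!VirtualAlloc(base, commitChunk, MEM_COMMIT, PAGE_READWRITE)) {
                std::fprintf(stderr, "commit failed: %lu\n", GetLastError());
                VirtualFree(base, 0, MEM_RELEASE);
                return 1;
            }

            // ... manage allocations inside [base, base + reserveSize) yourself ...

            VirtualFree(base, 0, MEM_RELEASE);  // release the whole reservation when done
            return 0;
        }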

    Finally, you can split your BSP across processes. Complicated and painful, and frankly just putting it on disk would be easier, but in theory you could get better performance by having a group of processes exchanging information, if you can keep everything resident (and assuming you can manage memory more cleverly than the OS handles file buffering... which is a big if). Each process would need far less memory and therefore shouldn't run into the 2 GB address space limit. Of course, you'll burn through RAM/swap a lot faster.

    You can mitigate the effects of address-space fragmentation by allocating smaller chunks. This has other nasty side effects, but you can follow a backoff policy where you grab smaller and smaller chunks of memory if an allocation fails. Frequently this simple approach will get you a program that works when it otherwise wouldn't, while performing as well as it can the rest of the time.
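    A small sketch of that backoff loop in C++; the function name and the halving policy are assumptions for illustration, not something prescribed in the answer:

        #include <cstdlib>
        #include <cstddef>

        // Try to allocate `want` bytes; on failure keep halving the request until
        // it succeeds or drops below `minSize`. Reports the size actually obtained.
        void* alloc_with_backoff(std::size_t want, std::size_t minSize, std::size_t& got) {
            for (std::size_t size = want; size >= minSize && size > 0; size /= 2) {
                if (void* p = std::malloc(size)) {
                    got = size;
                    return p;
                }
            }
            got = 0;
            return nullptr;  // even the smallest request failed
        }

    Callers then work with whatever block size came back, e.g. by processing the data in pieces instead of assuming one huge buffer.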

    Boy, doesn't 64-bit computing just sound so much nicer than the other choices?
