How to avoid running out of memory in a high-memory-usage application? (C/C++)

天命终不由人 2021-02-19 14:04

I have written a converter that takes OpenStreetMap XML files and converts them to a binary runtime rendering format that is typically about 10% of the original size. Input file

15 Answers
  •  醉话见心
    2021-02-19 14:26

    This is an old question, but since I've recently done the same thing...

    There is no simple answer. In an ideal world you'd use a machine with a huge address space (i.e. 64-bit) and massive amounts of physical memory. A huge address space alone is not sufficient, though, or it will just thrash. If you can't hold it all in RAM, parse the XML file into a database and pull out what you need with appropriate queries. Quite likely this is what OSM itself does (I believe the whole world is about 330 GB).
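    A minimal sketch of that staging step using the SQLite C API (the schema, file name, and dummy rows are illustrative, not from the original converter):

        // Stage parsed OSM nodes in SQLite so disk, not RAM, holds the data.
        #include <sqlite3.h>

        int main() {
            sqlite3 *db = nullptr;
            if (sqlite3_open("osm_stage.db", &db) != SQLITE_OK) return 1;

            sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS nodes("
                             "id INTEGER PRIMARY KEY, lat REAL, lon REAL);",
                         nullptr, nullptr, nullptr);

            // One big transaction makes bulk inserts far faster.
            sqlite3_exec(db, "BEGIN;", nullptr, nullptr, nullptr);

            sqlite3_stmt *ins = nullptr;
            sqlite3_prepare_v2(db, "INSERT INTO nodes VALUES(?,?,?);",
                               -1, &ins, nullptr);

            // In the real converter the XML parser would feed this loop;
            // two dummy nodes stand in for parsed <node> elements here.
            for (sqlite3_int64 id = 1; id <= 2; ++id) {
                sqlite3_bind_int64(ins, 1, id);
                sqlite3_bind_double(ins, 2, 51.5);   // lat
                sqlite3_bind_double(ins, 3, -0.1);   // lon
                sqlite3_step(ins);
                sqlite3_reset(ins);
            }
            sqlite3_finalize(ins);
            sqlite3_exec(db, "COMMIT;", nullptr, nullptr, nullptr);
            sqlite3_close(db);
            return 0;
        }

    Once the data is staged, a bounding-box query (e.g. WHERE lat BETWEEN ? AND ?) pulls out one tile's worth of nodes at a time instead of the whole planet.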

    In reality, I'm still using 32-bit XP for reasons of expediency.

    It's a trade-off between space and speed. You can do pretty much anything in any amount of memory, provided you don't care how long it takes. Using STL structures you can parse anything you want, but you'll soon run out of memory. You can define your own allocators that swap to disk, but again, it will be inefficient because the maps, vectors, sets, etc. do not really know what you are doing (there's a sketch of the hook point below).
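    For completeness, this is roughly where a custom allocator plugs in. The class below just forwards to malloc; the comment marks the spot where a real version would route large blocks to a memory-mapped file. It's a sketch of the mechanism, not the allocator the answer's author actually wrote:

        #include <cstddef>
        #include <cstdlib>
        #include <new>
        #include <vector>

        // Minimal C++11 allocator: the hook point for swapping strategies.
        template <typename T>
        struct SwapAllocator {
            using value_type = T;

            SwapAllocator() = default;
            template <typename U>
            SwapAllocator(const SwapAllocator<U> &) {}

            T *allocate(std::size_t n) {
                // A real version would send large blocks to an mmap'd file
                // so the OS pages them in and out, instead of exhausting
                // the 32-bit heap.
                if (void *p = std::malloc(n * sizeof(T)))
                    return static_cast<T *>(p);
                throw std::bad_alloc();
            }
            void deallocate(T *p, std::size_t) noexcept { std::free(p); }
        };

        template <typename T, typename U>
        bool operator==(const SwapAllocator<T> &, const SwapAllocator<U> &) {
            return true;
        }
        template <typename T, typename U>
        bool operator!=(const SwapAllocator<T> &, const SwapAllocator<U> &) {
            return false;
        }

        // Containers pick the allocator up through their template parameter:
        using NodeIds = std::vector<long long, SwapAllocator<long long>>;

    The catch stands, though: the container still shuffles elements around as if memory were cheap, so a swapping allocator alone doesn't make the access pattern disk-friendly.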

    The only way I found to make it all work in a small footprint on a 32-bit machine was to think very carefully about what I was doing and what was needed when, and to break the task into chunks. It's memory-efficient (never using more than ~100 MB) but not massively quick; then again, it doesn't matter: how often does one have to parse the XML data?
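    The chunked approach boils down to streaming: feed the file to a SAX-style parser in fixed-size buffers, so peak memory stays flat no matter how big the input is. A sketch using Expat (the file name and buffer size are placeholders):

        #include <expat.h>
        #include <cstddef>
        #include <cstdio>
        #include <cstring>

        // Called once per opening tag; nothing is retained between calls.
        static void XMLCALL onStart(void *, const XML_Char *name,
                                    const XML_Char **) {
            if (std::strcmp(name, "node") == 0) {
                /* handle one <node>, then forget it */
            }
        }

        int main() {
            XML_Parser p = XML_ParserCreate(nullptr);
            XML_SetElementHandler(p, onStart, nullptr);

            std::FILE *f = std::fopen("map.osm", "rb");
            if (!f) return 1;

            // Peak memory is one 64 KiB buffer, however big the file is.
            char buf[64 * 1024];
            std::size_t n;
            do {
                n = std::fread(buf, 1, sizeof buf, f);
                if (XML_Parse(p, buf, static_cast<int>(n), n == 0)
                        == XML_STATUS_ERROR)
                    break;  // malformed XML: stop feeding chunks
            } while (n > 0);

            std::fclose(f);
            XML_ParserFree(p);
            return 0;
        }

    With the handlers writing into the staging database from the first sketch, the two pieces combine into a converter whose footprint is independent of the input size.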
