Optimize read/write huge data (C++)

Happy的楠姐 2021-02-06 09:22

I am looking to optimize reading/writing huge data for a C++ simulation application. The data, termed a "map", essentially consists of integers, doubles, floats and a single

8 answers
  • 2021-02-06 09:57

    You might consider using memory-mapped files. For example, see boost::interprocess, which provides a convenient implementation.
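
    To make the memory-mapping suggestion concrete, here is a minimal sketch using the POSIX `mmap` primitives that boost::interprocess wraps portably. The file name and record layout (`map.bin`, an array of doubles) are illustrative assumptions, not part of the original question:

    ```cpp
    // Minimal sketch: memory-map a binary file of doubles (POSIX mmap).
    // boost::interprocess offers a portable wrapper around the same idea.
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <fcntl.h>
    #include <unistd.h>
    #include <cstdio>

    int main() {
        const char* path = "map.bin";            // hypothetical data file
        const double values[] = {1.5, 2.5, 3.5};

        // Write a small binary file so the example is self-contained.
        std::FILE* f = std::fopen(path, "wb");
        std::fwrite(values, sizeof(double), 3, f);
        std::fclose(f);

        int fd = open(path, O_RDONLY);
        struct stat st;
        fstat(fd, &st);

        // Map the whole file; the OS pages data in on demand, so no
        // explicit read() or text-parsing pass is needed.
        void* addr = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        const double* data = static_cast<const double*>(addr);

        std::printf("%.1f %.1f %.1f\n", data[0], data[1], data[2]);

        munmap(addr, st.st_size);
        close(fd);
        return 0;
    }
    ```

    Because the mapped region is just memory, the simulation can index straight into its integers/doubles/floats without a deserialization step.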

    Also, you might consider using STXXL, which provides STL-like functionality aimed towards large file-based datasets.

    One more thing: if you want iterator-like access to your data, have a look at boost::iterator_facade.

    If you don't want to play with fancy tricks, you could write an additional binary index file alongside the structure file, containing the starting offset of each structure. That gives you indirect random access: look up the offset in the index, then seek straight to the record.
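
    A small sketch of that index-file idea, assuming hypothetical file names (`map.dat`, `map.idx`) and length-prefixed variable-size records standing in for the question's structures:

    ```cpp
    // Write variable-length records to a data file while recording each
    // record's starting offset in a companion index file, then use the
    // index for O(1) random access to record #2.
    #include <cassert>
    #include <cstdint>
    #include <cstdio>
    #include <fstream>
    #include <string>
    #include <vector>

    int main() {
        const std::vector<std::string> records = {"alpha", "longer record", "z"};

        // Write the data file, collecting each record's starting offset.
        std::vector<std::uint64_t> offsets;
        {
            std::ofstream data("map.dat", std::ios::binary);
            for (const auto& r : records) {
                offsets.push_back(static_cast<std::uint64_t>(data.tellp()));
                std::uint32_t len = static_cast<std::uint32_t>(r.size());
                data.write(reinterpret_cast<const char*>(&len), sizeof(len));
                data.write(r.data(), len);
            }
        }
        // Write the companion index file: just the offsets, in order.
        {
            std::ofstream idx("map.idx", std::ios::binary);
            idx.write(reinterpret_cast<const char*>(offsets.data()),
                      offsets.size() * sizeof(std::uint64_t));
        }

        // Random access: fetch record #2 directly via its offset.
        std::ifstream idx("map.idx", std::ios::binary);
        std::uint64_t off;
        idx.seekg(2 * sizeof(std::uint64_t));
        idx.read(reinterpret_cast<char*>(&off), sizeof(off));

        std::ifstream data("map.dat", std::ios::binary);
        data.seekg(off);
        std::uint32_t len;
        data.read(reinterpret_cast<char*>(&len), sizeof(len));
        std::string r(len, '\0');
        data.read(&r[0], len);

        assert(r == "z");
        std::printf("record 2: %s\n", r.c_str());
        return 0;
    }
    ```

    Fixed-size index entries mean record *i*'s offset lives at byte `i * 8` of the index, so the lookup itself needs no scan.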

  • 2021-02-06 10:04

    Frameworks like Boost and ACE provide platform independent access to memory mapped files. That should speed up your parsing significantly.
