Optimize read/write huge data (C++)

Happy的楠姐 2021-02-06 09:22

I am looking to optimize reading/writing huge data for a C++ simulation application. The data, termed a "map", essentially consists of integers, doubles, floats and a single

8 answers
  •  隐瞒了意图╮
    2021-02-06 09:57

    You might consider using memory-mapped files. For example, see boost::interprocess, which provides a convenient implementation.

    Also, you might consider using STXXL, which provides STL-like functionality aimed at large file-based datasets.

    And one more: if you want iterator-like access to your data, have a look at boost::iterator_facade.

    If you don't want to play with the fancy tricks, you could provide an additional binary index file containing the starting offset of each structure in the data file. This would give you indirect random access.
