Optimize read/write huge data (C++)

Happy的楠姐
Happy的楠姐 2021-02-06 09:22

I am looking to optimize reading/writing huge data for a C++ simulation application. The data, termed a "map", essentially consists of integers, doubles, floats and a single

8 Answers
  •  长情又很酷
    2021-02-06 09:43

    The effectiveness of this idea depends on your access pattern, but if you are not reading the variable-sized data on every cycle, you might speed up access by rearranging your file layout.
    Instead of writing a direct dump of a structure like this:

    struct record {
      int x;
      some_enum t;           // whatever enum type the record carries
      int sz;                // length of the variable part in bytes
      char variable_data[];  // sz bytes of payload follow the fixed part
    };
    

    you could write all the fixed size parts up front, then store the variable portions afterward:

    struct record_header {
      int x;
      some_enum t;
      int sz;
      long offset_to_variable_data;  // file offset of the payload
    };
    

    Now, as you parse the file each cycle, you can read N records at a time with a single linear pass. You only have to deal with fseek when you actually need to fetch the variable-sized data. You might even keep the variable portion in a separate file, so that you also only ever read forward through that file.

    This strategy may even improve your performance if you do go with a memory-mapped file as others suggested.
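    As a minimal sketch of this layout, the following writes a record count, then a table of fixed-size headers (each carrying the offset of its payload), then all the variable payloads; reading the headers is one linear pass, and fetching a payload costs one seek. The `Header` fields, file format, and function names here are illustrative assumptions, not the original poster's actual "map" format:

    ```cpp
    #include <cstdint>
    #include <fstream>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical fixed-size header; the payload lives elsewhere in the file.
    struct Header {
        int32_t x;       // some fixed field of the record
        int32_t sz;      // length of the variable payload in bytes
        int64_t offset;  // absolute file offset of the payload
    };

    // Write all fixed-size headers up front, then all variable payloads.
    void write_map(const std::string& path,
                   const std::vector<std::pair<int32_t, std::string>>& records) {
        std::ofstream out(path, std::ios::binary);
        int64_t n = static_cast<int64_t>(records.size());
        out.write(reinterpret_cast<const char*>(&n), sizeof n);

        // Payloads start right after the count and the header table.
        int64_t offset = sizeof n + n * static_cast<int64_t>(sizeof(Header));
        for (const auto& r : records) {
            Header h{r.first, static_cast<int32_t>(r.second.size()), offset};
            out.write(reinterpret_cast<const char*>(&h), sizeof h);
            offset += h.sz;
        }
        for (const auto& r : records)
            out.write(r.second.data(),
                      static_cast<std::streamsize>(r.second.size()));
    }

    // Fetch one payload: jump to its header, then seek once to the data.
    std::string read_payload(const std::string& path, int64_t index) {
        std::ifstream in(path, std::ios::binary);
        int64_t n = 0;
        in.read(reinterpret_cast<char*>(&n), sizeof n);
        in.seekg(sizeof n + index * static_cast<int64_t>(sizeof(Header)));
        Header h{};
        in.read(reinterpret_cast<char*>(&h), sizeof h);
        std::string payload(static_cast<size_t>(h.sz), '\0');
        in.seekg(h.offset);
        in.read(payload.data(), h.sz);
        return payload;
    }
    ```

    A scan that only looks at the fixed fields never touches the payload section at all, which is where the savings come from; note that dumping structs raw like this assumes one machine reads what it wrote (same endianness and padding).
    
    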
