Optimize read/write huge data (C++)

Happy的楠姐 2021-02-06 09:22

I am looking to optimize reading/writing huge data for a C++ simulation application. The data, termed a "map", essentially consists of integers, doubles, floats and a single

8 Answers
  •  渐次进展
    2021-02-06 09:57

    Since you do not mention an OS that you are running this on, have you looked at memory mapping the file and then using standard memory routines to "walk" the file as you go along?

    That way you are not using fseek/fread; instead you are using pointer arithmetic. A classic mmap demonstration is copying a source file to a destination file. This may improve the performance.
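    The answer's original example link is not reproduced here, so what follows is a minimal POSIX sketch of the idea: map both files, then copy with plain pointer operations instead of fseek/fread. The file paths and the tiny payload are illustrative only, and error handling is omitted for brevity.

    ```cpp
    #include <cassert>
    #include <cstdio>
    #include <cstring>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main() {
        const char* src_path = "/tmp/mmap_src.bin";  // illustrative paths
        const char* dst_path = "/tmp/mmap_dst.bin";

        // Create a small source file to copy (a stand-in for the simulation "map").
        {
            FILE* f = std::fopen(src_path, "wb");
            const char data[] = "simulation map data";
            std::fwrite(data, 1, sizeof(data), f);
            std::fclose(f);
        }

        int src_fd = open(src_path, O_RDONLY);
        struct stat st;
        fstat(src_fd, &st);
        size_t len = static_cast<size_t>(st.st_size);

        int dst_fd = open(dst_path, O_RDWR | O_CREAT | O_TRUNC, 0644);
        ftruncate(dst_fd, static_cast<off_t>(len));  // destination must be sized first

        // Map both files; reads and writes become plain pointer arithmetic.
        void* src = mmap(nullptr, len, PROT_READ, MAP_PRIVATE, src_fd, 0);
        void* dst = mmap(nullptr, len, PROT_READ | PROT_WRITE, MAP_SHARED, dst_fd, 0);

        std::memcpy(dst, src, len);   // "walk" the file through pointers
        msync(dst, len, MS_SYNC);     // flush dirty pages to disk

        munmap(src, len);
        munmap(dst, len);
        close(src_fd);
        close(dst_fd);

        // Verify the copy round-tripped.
        FILE* f = std::fopen(dst_path, "rb");
        char buf[32] = {};
        std::fread(buf, 1, sizeof(buf), f);
        std::fclose(f);
        assert(std::strcmp(buf, "simulation map data") == 0);
        std::printf("copy ok\n");
        return 0;
    }
    ```

    The same pattern applies to the simulation's own reads: map the file once, then address the integers/doubles/floats at computed offsets instead of seeking.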

    Another thing you could look into is splitting the data up into smaller files, with a name derived from the time unit: close one file, then open the next to continue the simulation. That way you are dealing with smaller files that can be more aggressively cached by the host OS!
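    As a sketch of that splitting idea, one hypothetical naming scheme (not from the answer itself) derives each chunk's filename from the simulation time unit, so the simulation only ever has the active slice open:

    ```cpp
    #include <cstdio>
    #include <string>

    // Hypothetical scheme: one file per simulation time unit, so the OS page
    // cache only needs to hold the chunk currently being read or written.
    std::string chunk_path(const std::string& base, long time_unit) {
        char buf[64];
        std::snprintf(buf, sizeof(buf), "%s.%06ld.bin", base.c_str(), time_unit);
        return buf;
    }

    int main() {
        // As the simulation advances, close the previous chunk and open the next.
        for (long t = 0; t < 3; ++t) {
            std::string path = chunk_path("map", t);
            std::printf("%s\n", path.c_str());
        }
        return 0;
    }
    ```

    Zero-padding the time unit keeps the chunks lexically sorted on disk, which makes sequential replay of a simulation straightforward.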
