How to directly read a huge chunk of memory into std::vector?

Some programmer dude

Use the std::vector constructor that sets the size of the vector, and use std::vector::data to get a pointer to the allocated memory.

Keeping with your use of fread:

std::vector<T> x(big_n);
fread(x.data(), sizeof(T), big_n, fp);
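
For completeness, here is a minimal self-contained sketch of that approach, assuming a trivially copyable element type, a hypothetical element count, and a hypothetical file name "data.bin"; note that fread's return value should be checked against the requested count:

#include <cstdio>
#include <cstdint>
#include <vector>

int main() {
    const std::size_t big_n = 1'000'000;          // hypothetical element count
    std::vector<std::uint32_t> x(big_n);          // allocates and value-initializes big_n elements

    std::FILE* fp = std::fopen("data.bin", "rb"); // hypothetical input file
    if (!fp) return 1;

    // fread returns the number of elements actually read; check it.
    std::size_t got = std::fread(x.data(), sizeof(x[0]), x.size(), fp);
    x.resize(got);                                // shrink if the file was shorter than expected

    std::fclose(fp);
}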

As noted by others, using fread will most likely not work if the type T is not a POD type. In that case you can use C++ streams and std::istreambuf_iterator to read the file into the vector. However, this has the drawback that it loops over every item in the file, and if big_n is as big as it sounds, that can be a performance problem.
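
One way to soften that cost, sketched below with a hypothetical helper name readAll and under the assumption that the file size can be queried up front, is to reserve the full capacity before inserting, so the vector does not reallocate while the iterator loop runs:

#include <fstream>
#include <iterator>
#include <vector>

std::vector<char> readAll(const char* path)           // hypothetical helper
{
    std::ifstream in(path, std::ios::binary | std::ios::ate); // open at end to query size
    std::vector<char> buf;
    if (!in) return buf;

    buf.reserve(static_cast<std::size_t>(in.tellg())); // avoid repeated reallocation
    in.seekg(0);                                       // rewind and copy byte by byte
    buf.assign(std::istreambuf_iterator<char>(in),
               std::istreambuf_iterator<char>());
    return buf;
}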


However, if the file truly is big, I would rather recommend using memory mapping to read the file.
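
The answer does not show how; purely as an illustration, here is a minimal POSIX sketch (open/fstat/mmap, so Linux or macOS; Windows would use CreateFileMapping/MapViewOfFile), again assuming a hypothetical file "data.bin". Note that memory mapping gives you a read-only view of the file rather than a std::vector, which is exactly what makes it cheap for huge files:

#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fd = open("data.bin", O_RDONLY);              // hypothetical input file
    if (fd < 0) return 1;

    struct stat st;
    if (fstat(fd, &st) != 0) { close(fd); return 1; }

    // Map the whole file read-only; pages are loaded lazily by the OS.
    void* p = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (p == MAP_FAILED) { close(fd); return 1; }

    const unsigned char* bytes = static_cast<const unsigned char*>(p);
    std::printf("first byte: %u\n", st.st_size > 0 ? bytes[0] : 0u);

    munmap(p, st.st_size);
    close(fd);
}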

andre

This will read the file into a vector using std::istreambuf_iterator:

#include <vector>
#include <fstream>
#include <iterator>
// ...

std::ifstream testFile("testfile", std::ios::binary);
std::vector<unsigned char> fileContents((std::istreambuf_iterator<char>(testFile)),
                                        std::istreambuf_iterator<char>());

The iterator is instantiated for char because std::ifstream is a stream of char; the extra parentheses around the first argument avoid the most vexing parse.

This approach comes from a previous answer: https://stackoverflow.com/a/4761779/942596
