Very large array on the heap (Visual C++)

Submitted by 丶灬走出姿态 on 2019-12-02 02:27:35

If you are using a 32-bit application, then by default you have only 2 GB of user address space. 400 million integers take about 1.5 GB, and you are very unlikely to have that much contiguous address space available. It is possible to configure 32-bit Windows to give each process a 3 GB user address space, but this may just be a stopgap for your situation.

If you can move to a 64-bit architecture, this should not be an issue; otherwise you should find a way of storing your matrix data that does not require a single block of contiguous storage, for example by storing it in chunks.
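The chunked-storage idea above can be sketched as follows; this is a minimal illustration (the `ChunkedMatrix` name and layout are my own, not from the answer), where each row is a separate allocation, so no single 1.5 GB contiguous block is ever requested:

```cpp
#include <cstddef>
#include <vector>

// Sketch: a matrix stored as independently allocated row chunks.
// The address space only needs many small contiguous regions,
// not one huge one.
class ChunkedMatrix {
public:
    ChunkedMatrix(std::size_t rows, std::size_t cols)
        : rows_(rows), cols_(cols), data_(rows)
    {
        for (auto& row : data_)
            row.assign(cols, 0);  // each row is its own heap allocation
    }

    int& at(std::size_t r, std::size_t c) { return data_[r][c]; }
    std::size_t rows() const { return rows_; }
    std::size_t cols() const { return cols_; }

private:
    std::size_t rows_, cols_;
    std::vector<std::vector<int>> data_;  // rows allocated separately
};
```

A finer-grained variant could split each row into fixed-size blocks as well, which matters once individual rows themselves become very large.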

I think what you need is a Divide-and-Conquer algorithm. Not memory space.

I'm not sure whether, in your case, it wouldn't even be better to use STXXL.

Perhaps sparse matrices are of use in your application. This concept is used when dealing with big matrices that have a lot of zero entries, which is the case in quite a lot of applications.

And by the way, you do not gain anything by storing such a huge amount of data on the heap. Consider that your CPU cache is perhaps 12 MB! At least use some intelligent dynamic memory allocation mechanism.

Does the whole array really need to be allocated? Do you really use the whole array? Is it an array with lots of zeros? If so, the fact that it works better on Linux can be explained.

In that case using a sparse array might be more appropriate. Using an existing sparse array implementation would reduce the memory footprint and maybe allow faster computation.
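In practice you would use an existing library, but as a rough sketch of the sparse idea (the `SparseMatrix` class and its key encoding are illustrative assumptions, not a real library API), only nonzero entries are stored in a hash map and everything absent reads as zero:

```cpp
#include <cstdint>
#include <unordered_map>

// Sketch: sparse matrix storing only nonzero entries, keyed by (row, col)
// packed into a 64-bit integer. Memory use scales with the number of
// nonzeros, not with rows * cols.
class SparseMatrix {
public:
    void set(std::uint32_t r, std::uint32_t c, int v) {
        const std::uint64_t key = (std::uint64_t(r) << 32) | c;
        if (v == 0)
            entries_.erase(key);   // keep the map truly sparse
        else
            entries_[key] = v;
    }

    int get(std::uint32_t r, std::uint32_t c) const {
        const std::uint64_t key = (std::uint64_t(r) << 32) | c;
        auto it = entries_.find(key);
        return it == entries_.end() ? 0 : it->second;  // absent => 0
    }

    std::size_t nonzeros() const { return entries_.size(); }

private:
    std::unordered_map<std::uint64_t, int> entries_;
};
```

A real implementation (e.g. CSR storage in a linear-algebra library) would also give you fast row iteration, which a plain hash map does not.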

user445106

I just found a very simple solution, but I don't know if it is advisable:

#include <iostream>

int tab[400000000] = {0}; // global array: placed in the zero-initialized data segment, not the heap

int main(array<System::String ^> ^args) // C++/CLI entry point
{
    std::cout << tab[399999999] << std::endl; // ok

    /*
    int* tab = new int[400000000]; // doesn't work
    ...
    delete[] tab;
    */
    return 0;
}