Title says it all.
Basically, I am getting tired of having to reset my computer every time I mistakenly make MATLAB use a large amount of RAM for a simulation with many particles.
You can set a virtual memory quota for a process group. On Windows, use a Job object; on *nix, use ulimit. This works with any process, not just MATLAB.
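For the *nix case, here's a minimal sketch (bash; the 8 GB cap is an arbitrary value for illustration): launch MATLAB from a shell whose virtual memory has been capped with ulimit, and oversized allocations fail with an out-of-memory error instead of dragging the whole machine into swap.

ulimit -v 8388608    # cap virtual memory for this shell and its children, in kB (~8 GB)
matlab -nodesktop    # MATLAB inherits the limit from the launching shell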
The problem you see occurs when MATLAB starts to use virtual memory. You can normally kill the MATLAB process via the Task Manager, but that's not always desirable. Unfortunately, there is no MATLAB-internal switch that globally limits the maximum array size.
What you can do is make the swap file very small, so that MATLAB can't write much to it, but this may in turn hurt the performance of other programs. Non-MATLAB solutions are to switch to Linux (where you can set memory limits for a program more easily; see @BenVoigt's answer for details on setting limits on both Windows and Linux), or to run everything in a virtual machine.
For future reference: in my simulations, I have a method (a subfunction, if you don't want to go OOP) at the beginning of my pre-allocation that estimates the total memory usage from the simulation parameters (the number of elements of all the large arrays, times 8 bytes per element for doubles), and that throws an error if the simulation would use too much RAM.
Here's an example of such a quick memory check. I know that I'm going to allocate 3 m-by-3-by-t arrays and 5 m-by-t arrays, all of them double.
maxMemFrac = 0.8; %# use at most 80% of the available memory
numElements = 3 * (m * 3 * t) + 5 * (m * t); %# m and t are simulation parameters
numBytesNeeded = numElements * 8; %# 8 bytes per element for double
%# read available memory (the MEMORY function is Windows-only)
[~, memStats] = memory;
if numBytesNeeded > memStats.PhysicalMemory.Available * maxMemFrac
    error('MYSIM:OUTOFMEMORY', 'too much memory would be needed')
end
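Note that the MEMORY function is only available on Windows. For Linux, here's a minimal sketch of the same check, assuming a kernel that reports MemAvailable in /proc/meminfo:

%# read the available-memory estimate from /proc/meminfo (Linux only)
txt = fileread('/proc/meminfo');
availKb = sscanf(txt(strfind(txt, 'MemAvailable:'):end), 'MemAvailable: %d');
if numBytesNeeded > availKb * 1024 * maxMemFrac
    error('MYSIM:OUTOFMEMORY', 'too much memory would be needed')
end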