In my unit test I am deliberately trying to raise an OutOfMemoryError. I use a simple statement like the following:
byte[] block = new byte[128 * 1024 * 1024 * 1024];
You could deliberately set the maximum heap size of your JVM to a small amount by using the -Xmx flag.
Launch the following program:
public final class Test {
    public static void main(final String[] args) {
        final byte[] block = new byte[Integer.MAX_VALUE];
    }
}
with the following JVM argument: -Xmx8m
That will do the trick:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at Test.main(Test.java:4)
The reason you see no OutOfMemoryError is that the memory is allocated in an uncommitted state, with no pages actually backing it.
If you write a non-zero byte into each 4 KiB page of the array, that will force the memory to actually be committed.
Linux doesn't always allocate you all the memory you ask for immediately, since many real applications ask for more than they need. This is called overcommit (it also means sometimes it guesses wrong, and the dreaded OOM killer strikes).
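The page-touching trick described above can be sketched like this (the 64 MiB size and the 4 KiB page size are illustrative assumptions; your system's page size may differ):

```java
public final class TouchPages {
    private static final int PAGE = 4096; // assumed page size in bytes

    public static void main(final String[] args) {
        // The allocation may only reserve address space; Linux commits
        // physical pages lazily because of overcommit.
        final byte[] block = new byte[64 * 1024 * 1024];
        // Writing one non-zero byte per page forces each page to be
        // committed, so memory pressure shows up here rather than at
        // allocation time.
        for (int i = 0; i < block.length; i += PAGE) {
            block[i] = 1;
        }
        System.out.println("touched " + (block.length / PAGE) + " pages");
    }
}
```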
For your unit test, I would just throw OutOfMemoryError manually.
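A minimal sketch of that approach: instead of really exhausting the heap, have a stubbed collaborator throw the error so you can exercise your recovery path deterministically. The names `loadLargeResource` and the "fallback" behaviour here are made up for illustration; substitute whatever your code under test actually does.

```java
import java.util.function.Supplier;

public final class OomSimulationTest {
    // Hypothetical method under test: returns a fallback value when the
    // supplied loader runs out of memory.
    static String loadLargeResource(final Supplier<String> loader) {
        try {
            return loader.get();
        } catch (final OutOfMemoryError e) {
            return "fallback";
        }
    }

    public static void main(final String[] args) {
        // Simulate the OOM instead of allocating gigabytes in the test.
        final String result = loadLargeResource(() -> {
            throw new OutOfMemoryError("simulated");
        });
        System.out.println(result); // prints "fallback"
    }
}
```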
Minor point, but allocating new long[Integer.MAX_VALUE] will use up memory 8x faster (~16 GB per array).
ulimit -v 102400
ulimit -d 102400
unitTest.sh
The above should limit your unit test to 100 MiB of virtual memory and a 100 MiB data segment (ulimit takes KiB, so 102400 = 100 MiB). When you reach either of those limits, your process should get ENOMEM. Careful: these restrictions apply until the process/shell where you set them exits, so you might want to run them in a subshell.
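The subshell idea can be sketched as follows; `unitTest.sh` is the test runner from above, and the `|| echo` fallback is only there so a missing script doesn't abort the whole run:

```shell
# Apply the limits inside ( ... ) so the calling shell keeps its own limits.
(
    ulimit -v 102400   # virtual memory: 102400 KiB = 100 MiB
    ulimit -d 102400   # data segment:   100 MiB
    ./unitTest.sh || echo "test run finished (or script missing)"
)
# Back in the parent shell, the original limits are untouched.
ulimit -v
```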
See man 2 setrlimit for details on how that works under the hood, and help ulimit for the ulimit shell builtin.
128*1024*1024*1024 = 0 because the multiplication is done in 32-bit int arithmetic and overflows. Also note that a Java array cannot have more than Integer.MAX_VALUE elements, so a single byte[] tops out just under 2 GB.
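The overflow is easy to verify directly (a minimal sketch):

```java
public final class Overflow {
    public static void main(final String[] args) {
        // All four operands are ints, so the product wraps modulo 2^32:
        // 128 * 1024^3 = 2^37, and 2^37 mod 2^32 = 0.
        final int size = 128 * 1024 * 1024 * 1024;
        System.out.println(size); // prints 0

        // With a long first operand the intended value survives, though it
        // is far larger than any single Java array can hold.
        final long intended = 128L * 1024 * 1024 * 1024;
        System.out.println(intended); // prints 137438953472
    }
}
```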