Question
I have a binary that, on startup, loads a list of short strings and stores it in memory as a map from each string to a protobuf message (which itself contains the string). Not an ideal design, but it is hard to change for legacy reasons. Recently the list has grown from roughly 2M to 20M entries, and constructing the map now fails.
First I got OutOfMemoryError: Java heap space.
When I increased the heap size using the -Xms and -Xmx flags, we ran into GC overhead limit exceeded.
It runs on a 64-bit Linux machine with 15GB of available memory and the following JVM args (I increased the machine's RAM from 10G to 15G and the heap flags from 6000M to 9000M):
-Xms9000M -Xmx9000M -XX:PermSize=512m -XX:MaxPermSize=2018m
This binary does a whole lot of other things and is serving live traffic, so I can't afford to have it occasionally stuck.
Edit: I eventually did the obvious thing, which was fixing the code (changing the HashMap to an ImmutableSet) and adding more RAM (-Xmx11000M).
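The fix in the edit can be sketched as follows. This is a hypothetical reconstruction, not the asker's actual code: the protobuf message is replaced by a placeholder Entry record, and the JDK's Set.copyOf (Java 10+, here with a Java 16+ record) stands in for Guava's ImmutableSet mentioned in the edit. The point is that, since each value already contains its key string, a set of the values avoids storing every key a second time and avoids HashMap's per-entry table overhead.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class Dedup {
    // Placeholder for the protobuf message; it carries its own key string.
    record Entry(String key, byte[] payload) {}

    // Before: the map duplicates each key alongside a value that already
    // contains it, and HashMap adds per-entry node and table overhead.
    static Map<String, Entry> asMap(Iterable<Entry> entries) {
        Map<String, Entry> m = new HashMap<>();
        for (Entry e : entries) m.put(e.key(), e);
        return m;
    }

    // After: an immutable set of the values alone. Set.copyOf builds a
    // compact, unmodifiable set (the post's actual fix used Guava's
    // ImmutableSet, which behaves similarly for this purpose).
    static Set<Entry> asSet(Iterable<Entry> entries) {
        List<Entry> tmp = new ArrayList<>();
        entries.forEach(tmp::add);
        return Set.copyOf(tmp);
    }

    public static void main(String[] args) {
        Entry e = new Entry("foo", new byte[0]);
        Set<Entry> s = asSet(List.of(e));
        System.out.println(s.contains(e)); // membership test replaces the map lookup
    }
}
```

Whether a set is enough depends on the access pattern: it works here because lookups only need to answer "is this string in the list", with the full message recoverable from the element itself.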
I'm looking for a temporary solution, if that's possible, until we have a more scalable one.
Answer 1:
First, you need to figure out if the "OOME: GC overhead limit exceeded" is due to the heap being:
too small ... causing the JVM to do repeated Full GCs, or
too large ... causing the JVM to thrash the virtual memory when a Full GC is run.
You should be able to distinguish these two cases by turning on and examining the GC logs, and using OS-level monitoring tools to check for excessive paging loads. (When checking the paging levels, also check that the problem isn't due to competition for RAM between your JVM and another memory-hungry application.)
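To turn on the GC logs and the OS-level monitoring described above, something like the following would work on the JDK 7/8-era JVM implied by the -XX:PermSize flag (the log path and myapp.jar are placeholders, not from the original post):

```shell
# Write a detailed, timestamped GC log to a file for later inspection.
java -Xms9000M -Xmx9000M \
     -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
     -Xloggc:/var/log/myapp-gc.log \
     -jar myapp.jar    # myapp.jar is a placeholder for the actual binary

# In parallel, watch for paging on the host: sustained non-zero si/so
# (swap-in/swap-out) columns indicate the heap is thrashing virtual memory.
vmstat 5
```

Back-to-back Full GC entries in the log with little memory reclaimed point to "heap too small"; low GC counts but heavy swap activity in vmstat point to "heap too large".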
If the heap is too small, try making it bigger. If it is too big, make it smaller. If your system is showing both symptoms ... then you have a big problem.
You should also check that "compressed oops" is enabled for your JVM, as that will reduce your JVM's memory footprint. The -XshowSettings option lists the settings in effect when the JVM starts. Use -XX:+UseCompressedOops to enable compressed oops if they are disabled.
(You will probably find that compressed oops are enabled by default, but it is worth checking. This would be an easy fix ...)
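Concretely, the check and the explicit override look like this (myapp.jar is a placeholder):

```shell
# Print the effective VM settings at startup; the output includes the
# configured heap sizes, and adding -version makes the JVM exit immediately.
java -XshowSettings:vm -version

# Force compressed oops on. On 64-bit HotSpot JVMs they are on by default
# for heaps below roughly 32GB, so this 9GB heap should already use them.
java -XX:+UseCompressedOops -Xms9000M -Xmx9000M -jar myapp.jar
```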
If none of the above work, then your only quick fix is to get more RAM.
But obviously, the real solution is to reengineer the code so that you don't need a huge (and increasing over time) in-memory data structure.
Source: https://stackoverflow.com/questions/35574184/jvm-issues-with-a-large-in-memory-object