Burst memory usage in Java

臣服心动 asked 2021-02-03 23:57

I am trying to get a handle on proper memory usage and garbage collection in Java. I'm not a novice programmer by any means, but it always seems to me that once Java touches some memory, it's gone forever. You will never get it back.

8 Answers
  • 2021-02-04 00:33

    ... but it always seems to me that once Java touches some memory, it's gone forever. You will never get it back.

    It depends on what you mean by "gone forever".

    I've also heard it said that some JVMs do give memory back to the OS when they are ready and able to. Unfortunately, given the way that the low-level memory APIs typically work, the JVM has to give back entire segments, and it tends to be complicated to "evacuate" a segment so that it can be given back.

    But I wouldn't rely on that ... because there are various things that could prevent the memory being given back. The chances are that the JVM won't give the memory back to the OS. But it is not "gone forever" in the sense that the JVM will continue to make use of it. Even if the JVM never approaches the peak usage again, all of that memory will help to make the garbage collector run more efficiently.

    In that case, you have to make sure your peak memory is never too high, or your application will continually eat up hundreds of MB of RAM.

    That is not true. Assuming that you are adopting the strategy of starting with a small heap and letting it grow, the JVM won't ask for significantly more memory than the peak memory. The JVM won't continually eat up more memory ... unless your application has a memory leak and (as a result) its peak memory requirement has no bound.

    (The OP's comments below indicate that this is not what he was trying to say. Even so, it is what he did say.)
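The grow-to-peak-and-stay behaviour described above can be observed from inside the JVM. A minimal sketch (the class name is made up, and exact growth depends on the JVM and its flags) that allocates a burst of garbage and reports the committed heap before and after:

```java
public class HeapGrowthProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("committed at start: " + rt.totalMemory());
        // Allocate ~64 MB of short-lived garbage to push the heap toward a peak.
        for (int i = 0; i < 64; i++) {
            byte[] block = new byte[1 << 20]; // 1 MB, immediately unreachable
        }
        // The committed heap grows toward the peak demand and typically stays
        // there; maxMemory() is the -Xmx ceiling it will not grow beyond.
        System.out.println("committed after:    " + rt.totalMemory());
        System.out.println("ceiling (-Xmx):     " + rt.maxMemory());
    }
}
```

Running with a small `-Xms` and watching `totalMemory()` climb is the "start small and let it grow" strategy the answer refers to.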


    On the topic of garbage collection efficiency, we can model the cost of a run of an efficient garbage collector as:

    cost ~= (amount_of_live_data * W1) + (amount_of_garbage * W2)
    

    where W1 and W2 are (we assume) constants that depend on the collector. (Actually, this is an over-simplification. The first part is not a linear function of the number of live objects. However, I claim that it doesn't matter for the following.)

    The efficiency of the collector can then be stated as:

    efficiency = cost / amount_of_garbage_collected
    

    which (if we assume that the GC collects all data) expands to

    efficiency ~= (amount_of_live_data * W1) / amount_of_garbage + W2.
    

    When the GC runs,

    heap_size ~= amount_of_live_data + amount_of_garbage
    

    so

    efficiency ~= W1 * (amount_of_live_data / (heap_size - amount_of_live_data) )
                  + W2.
    

    In other words:

    • as you increase the heap size, the efficiency tends to a constant (W2), but
    • you need a large ratio of heap_size to amount_of_live_data for this to happen.

    The other point is that for an efficient copying collector, W2 covers just the cost of zeroing the space occupied by the garbage objects in 'from space'. The rest (tracing, copying of live objects to 'to space', and zeroing the 'from space' that they occupied) is part of the first term of the initial equation; i.e. covered by W1. What this means is that W2 is likely to be considerably smaller than W1 ... and that the first term of the final equation remains significant for longer.

    Now obviously this is a theoretical analysis, and the cost model is a simplification of how real garbage collectors really work. (And it doesn't take account of the "real" work that the application is doing, or the system-level effects of tying down too much memory.) However, the maths tells me that from the standpoint of GC efficiency, a big heap really does help a lot.
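    The model above can be checked numerically. A small sketch (the weights W1 and W2 are made-up values, purely for illustration) showing that the cost per unit of garbage collapses toward W2 as the heap grows relative to the live set:

    ```java
    public class GcCostModel {
        // efficiency ~= W1 * (live / (heap_size - live)) + W2, per the derivation above
        static double efficiency(double live, double heapSize, double w1, double w2) {
            return w1 * (live / (heapSize - live)) + w2;
        }

        public static void main(String[] args) {
            double live = 100, w1 = 10, w2 = 1; // hypothetical values
            // Heap only 2x the live data: the W1 term dominates.
            System.out.println(efficiency(live, 200, w1, w2));  // prints 11.0
            // Heap 11x the live data: efficiency approaches W2.
            System.out.println(efficiency(live, 1100, w1, w2)); // prints 2.0
        }
    }
    ```

    With the same live set, quintupling the headroom cuts the per-unit collection cost by more than 5x in this model, which is the "a big heap really does help" conclusion in miniature.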

  • 2021-02-04 00:42

    Once the program terminates, does the memory usage actually go down in Task Manager on Windows? I think the memory is being released, but the default task monitors in the OS do not show it as released, because they report the process's committed footprint rather than the heap-internal usage. Compare this C++ question: Problem with deallocating vector of pointers
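    One way to see the distinction while the process is still running (a sketch; `System.gc()` is only a request, so the drop is likely but not guaranteed): the heap-internal usage falls once the references are dropped, while the committed size that the OS task manager reports typically stays at its peak:

    ```java
    public class ReleaseVsFootprint {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            byte[][] blocks = new byte[50][];
            for (int i = 0; i < blocks.length; i++) {
                blocks[i] = new byte[1 << 20]; // hold ~50 MB of live data
            }
            long usedPeak = rt.totalMemory() - rt.freeMemory();
            blocks = null;    // drop the only references
            System.gc();      // request (not force) a collection
            long usedAfter = rt.totalMemory() - rt.freeMemory();
            System.out.println("used at peak:  " + usedPeak);
            System.out.println("used after gc: " + usedAfter);
            // totalMemory() -- roughly what the OS monitor sees as committed --
            // usually stays high even after the collection.
            System.out.println("committed:     " + rt.totalMemory());
        }
    }
    ```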
