Under what circumstances does Java performance degrade with more memory?

Submitted by 微笑、不失礼 on 2020-01-01 06:11:24

Question


We're load testing a Java 1.6 application in our DEV environment. The JVM heap allocation is 2 GB (-Xms2048m -Xmx2048m). Under load testing, the app runs smoothly, never uses more than 1.25 GB of heap, and garbage collection is completely normal.

In our UAT environment, we run the load test with the same parameters; the only difference is the JVM heap, which is allocated 4 GB (-Xms4096m -Xmx4096m). Otherwise, the hardware is exactly the same as DEV. But during load testing, performance is horrendous: the app eats up nearly the entire heap, and garbage collection runs rampant.

We've run these tests over and over again and eliminated every factor we could think of that might influence performance, but the results are the same. Under what circumstances can this happen?


Answer 1:


The culprit could be garbage collection. Ordinary "stop-the-world" collection caused us similar performance problems: the server software ran very slowly even though the load on the server was low. Eventually we found that a single "stop-the-world" garbage collector thread was holding up the entire application almost constantly under certain scenarios (operations that produce lots of garbage).

Moving to a multi-threaded collector alleviated the problem, with the startup parameters -XX:+UseParallelOldGC -XX:ParallelGCThreads=8 (note that this selects the parallel old-generation collector, not a truly concurrent one; the concurrent low-pause option on Java 6 is CMS, enabled with -XX:+UseConcMarkSweepGC). We were using "only" 2 GB heaps in test and production, but it is also worth noting that the time GC takes grows with the size of the heap (even if your software never actually uses all of it).
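
As a rough sketch, the launch line ends up looking something like the following. The GC-logging flags and the jar name are illustrative additions (standard HotSpot 6 options for watching pause times), not the exact command we used:

    java -Xms2048m -Xmx2048m \
         -XX:+UseParallelOldGC -XX:ParallelGCThreads=8 \
         -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:gc.log \
         -jar your-app.jar

Comparing the pause times reported in gc.log before and after changing the collector makes the effect easy to see.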

You might want to read more about the different garbage collector options and tuning here: Java SE 6 HotSpot[tm] Virtual Machine Garbage Collection Tuning.

Also, the answers to this question could provide some help: Java very large heap sizes.




Answer 2:


There is something different about your application in the DEV and UAT environments.

Judging from the symptoms, it is (IMO) unlikely to be a hardware issue, operating-system performance tuning, or a difference in JVM versions. And simply giving the application more memory should not, by itself, cause this.

(It is not inconceivable that your application might do something strange, like sizing some data structures based on the maximum heap size and getting the calculation wrong. But I think you'd be aware of that possibility, so let's set it aside for now.)
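
For what it's worth, that kind of mistake typically looks something like the following. This is a purely hypothetical sketch, not taken from the application in question; the class name and the per-entry constant are invented.

    import java.util.HashMap;
    import java.util.Map;

    public class CacheSizing {
        // Invented constant, for illustration only.
        private static final long ESTIMATED_BYTES_PER_ENTRY = 512;

        public static void main(String[] args) {
            // Sizing a cache from the maximum heap: with -Xmx4096m this computes roughly
            // twice as many entries as with -Xmx2048m, so a bad per-entry estimate
            // scales up with the heap and can dominate memory use on the bigger box.
            long maxHeapBytes = Runtime.getRuntime().maxMemory();
            int cacheEntries = (int) (maxHeapBytes / ESTIMATED_BYTES_PER_ENTRY);
            Map<String, byte[]> cache = new HashMap<String, byte[]>(cacheEntries);
            System.out.println("Pre-sized cache for " + cacheEntries + " entries");
        }
    }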

It is probably related to a difference in the OS environment; e.g. a different version of the OS or some application, differences in networking, differences in locales, and so on. But the bottom line is that it is 99% certain there is a memory leak in your application when it runs in the UAT environment, and that leak is what is chewing up heap memory and overloading the GC.

My advice would be to treat this as a memory leak problem and use the standard tools and techniques to track down its cause. In the process, you will most likely figure out why it only occurs in UAT.
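
With the standard JDK 6 tools, that typically looks something like the following (the PID and file names are placeholders):

    jmap -histo:live <pid>                               # class histogram of live objects
    jmap -dump:live,format=b,file=uat-heap.hprof <pid>   # binary heap dump
    jhat uat-heap.hprof                                  # browse the dump, or load it in Eclipse MAT / VisualVM

Taking a couple of dumps at different points during the load test and comparing which classes keep growing usually points straight at the leaking object graph.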




Answer 3:


It would be worthwhile to take heap dumps on both machines and analyze what is consuming the heap differently in the two environments. Class histograms will help.
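
For example (assuming the JDK 6 jmap tool; <pid> is a placeholder), capture a histogram on each machine at the same point in the load test and compare the top entries:

    jmap -histo:live <pid> | head -n 30 > histo-dev.txt    # on the DEV host
    jmap -histo:live <pid> | head -n 30 > histo-uat.txt    # on the UAT host
    diff histo-dev.txt histo-uat.txt

Whatever class dominates the UAT histogram but not the DEV one is the place to start digging.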



Source: https://stackoverflow.com/questions/12208337/under-what-circumstances-does-java-performance-degrade-with-more-memory
