I took a heap dump of my program. When I opened it in the memory analyzer tool, I found that the java.lang.ref.Finalizer
for org.logicalcobwebs.proxool.Pr
Some classes implement the Object.finalize()
method. Objects which override this method need to have their finalize() method called by a background finalizer thread, and they can't be cleaned up until this happens. If these finalization tasks are short and you don't discard many such objects, it all works well. However, if you are creating lots of these objects and/or their finalizers take a long time, the queue of objects waiting to be finalized builds up. It is possible for this queue to use up all of the memory.
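To make that concrete, here is a minimal sketch (the class name, payload size and sleep time are invented for illustration) of an object with a slow finalize() method. If you run it, discarded instances pile up behind the single finalizer thread and the heap eventually fills with objects waiting to be finalized.

```java
public class FinalizerBacklog {

    static class SlowToFinalize {
        // Retained until finalize() has run, because the pending Finalizer
        // entry keeps the object (and this payload) reachable.
        private final byte[] payload = new byte[1024 * 1024];

        @Override
        protected void finalize() throws Throwable {
            try {
                Thread.sleep(100); // a deliberately slow finalizer
            } finally {
                super.finalize();
            }
        }
    }

    public static void main(String[] args) {
        // Each iteration discards an instance, but it cannot be reclaimed until
        // the finalizer thread has called finalize() on it. Because finalize()
        // is slow, the backlog grows and eventually exhausts the heap.
        while (true) {
            new SlowToFinalize();
        }
    }
}
```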
The solution is either to avoid creating large numbers of objects with finalizers, to keep the finalize() methods short enough for the finalizer thread to keep up, or to make sure that your application explicitly releases the resources so that the finalizers have little or nothing to do.
The last option is likely to be best for you as you are using an existing library.
From what I can make out, Proxool is a connection pool for JDBC connections. This suggests to me that the problem is that your application is misusing the connection pool. Instead of calling close
on the statement objects, your code is probably dropping them and/or their parent connections. Proxool is relying on finalizers to close the underlying driver-implemented objects ... but that requires one of those Finalizer instances for each discarded object. It could also mean that you are causing the pool to open / close (real) database connections more frequently than is necessary, and that would be bad for performance.
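For example, the kind of leak being described looks something like this (a sketch with invented names; the DataSource stands in for however your code obtains Proxool connections). Nothing is closed, so each discarded wrapper just sits in the finalization queue until the finalizer thread reaches it.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

public class LeakyQuery {

    private final DataSource dataSource; // e.g. a Proxool-backed DataSource

    public LeakyQuery(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public void printIds() throws SQLException {
        Connection conn = dataSource.getConnection();
        PreparedStatement stmt = conn.prepareStatement("SELECT id FROM orders");
        ResultSet rs = stmt.executeQuery();
        while (rs.next()) {
            System.out.println(rs.getLong(1));
        }
        // No rs.close(), stmt.close() or conn.close(): the objects simply go
        // out of scope here and queue up behind the finalizer thread.
    }
}
```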
So I suggest that you check your code for leaked ResultSet, Statement and/or Connection objects, and make sure that you close them in finally
blocks.
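Here is a sketch of that pattern, again with an invented DataSource and query: every ResultSet, Statement and Connection is closed in a finally block, so Proxool's finalizers have nothing left to do.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

public class PooledQuery {

    private final DataSource dataSource; // e.g. a Proxool-backed DataSource

    public PooledQuery(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public int countOrders() throws SQLException {
        Connection conn = dataSource.getConnection();
        try {
            PreparedStatement stmt = conn.prepareStatement("SELECT COUNT(*) FROM orders");
            try {
                ResultSet rs = stmt.executeQuery();
                try {
                    return rs.next() ? rs.getInt(1) : 0;
                } finally {
                    rs.close();
                }
            } finally {
                stmt.close();
            }
        } finally {
            conn.close(); // returns the connection to the pool
        }
    }
}
```

On Java 7 and later, try-with-resources gives the same guarantee with less nesting.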
Looking at the memory dump, I expect you are concerned about where the 898,527,228 bytes are going. The vast majority are retained by the Finalizer object whose id is 2aab07855e38. If you still have the dump file, take a look at what that Finalizer is referring to. It looks more problematic than the Proxool objects.