Question
I am trying to process a collection of heavyweight elements (images). The size of the collection varies between 8,000 and 50,000 entries, but for some reason after processing 1,800-1,900 entries my program fails with java.lang.OutOfMemoryError: Java heap space.
In my understanding, each time I call session.getTransaction().commit() the program should free heap memory, but it looks like that never happens. What am I doing wrong? Here is the code:
private static void loadImages( LoadStrategy loadStrategy ) throws IOException {
    log.info( "Loading images for: " + loadStrategy.getPageType() );
    Session session = sessionFactory.openSession();
    session.setFlushMode( FlushMode.COMMIT );
    Query query = session.createQuery( "from PageRaw where pageType = :pageType and pageStatus = :pageStatus and sessionId = 1" );
    query.setString( "pageStatus", PageStatus.SUCCESS.name() );
    query.setString( "pageType", loadStrategy.getPageType().name() );
    query.setMaxResults( 50 );
    List<PageRaw> pages;
    int resultNum = 0;
    do {
        session.getTransaction().begin();
        log.info( "Get pages starting from " + resultNum + " position" );
        query.setFirstResult( resultNum );
        resultNum += 50;
        pages = query.list();
        log.info( "Found " + pages.size() + " pages" );
        for ( PageRaw pr : pages ) {
            Set<String> imageUrls = new HashSet<>();
            for ( UrlLocator imageUrlLocator : loadStrategy.getImageUrlLocators() ) {
                imageUrls.addAll(
                    imageUrlLocator.locateUrls( StringConvector.toString( pr.getSourceHtml() ) )
                );
            }
            removeDeletedImageRaws( pr.getImages(), imageUrls );
            loadNewImageRaws( pr.getImages(), imageUrls );
        }
        session.getTransaction().commit();
    } while ( pages.size() > 0 );
    session.close();
}
Answer 1:
You have confused flushing with clearing:
- flushing a session executes all pending statements against the database (it synchronizes the in-memory state with the database state);
- clearing a session purges the session (first-level) cache, thus freeing memory.
So you need to both flush and clear the session in order to recover the occupied memory.
In addition, you must disable the second-level cache; otherwise all (or most) of the objects will remain reachable even after clearing the session.
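As a minimal sketch, the end of the question's loop would become (names taken from the question; with FlushMode.COMMIT the commit itself performs the flush, so only the clear needs to be added):

session.getTransaction().commit(); // flushes pending statements, then commits
session.clear();                   // purge the first-level cache so processed PageRaw objects can be garbage collected

To disable the second-level cache, the standard Hibernate setting is hibernate.cache.use_second_level_cache=false in the configuration.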
Answer 2:
I don't know why you think committing a transaction frees heap memory; running garbage collection does that.
An OutOfMemoryError can also happen if your perm gen is exhausted (though the message would then say "PermGen space" rather than "Java heap space").
The easy answer is to increase the min and max heap sizes (and the perm gen size) when you start the JVM and see if the error goes away.
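For example (illustrative values; the main class name is hypothetical, and -XX:MaxPermSize only applies to Java 7 and earlier):

java -Xms512m -Xmx2048m -XX:MaxPermSize=256m com.example.ImageLoader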
I'd recommend getting a profiler, such as VisualVM, and seeing what is consuming your memory at runtime. It should then be easy to fix.
I'd guess that you're trying to commit too large a chunk at once. Break it up into smaller pieces and see if that helps.
Answer 3:
Try using session.clear(), whose documentation says: "Completely clear the session. Evict all loaded instances and cancel all pending saves, updates and deletions. Do not close open iterators or instances of ScrollableResults."
Answer 4:
This article ("bulk fetching hibernate") solved my issue:
Session session = sessionFactory.getCurrentSession();
ScrollableResults scrollableResults = session.createQuery("from DemoEntity").scroll(ScrollMode.FORWARD_ONLY);
int count = 0;
while (scrollableResults.next()) {
    if (++count % 100 == 0) {
        System.out.println("Fetched " + count + " entities");
    }
    DemoEntity demoEntity = (DemoEntity) scrollableResults.get()[0];
    // process and write the result
    session.evict(demoEntity); // important: detach the entity so it can be garbage collected
}
The key points from the article:
- use Hibernate's ScrollableResults;
- use session.evict() on each processed entity.
BTW, I tried the stateless-session solution, but it gave me an exception I did not know how to solve (maybe you can improve this answer). The exception details:
org.hibernate.SessionException: collections cannot be fetched by a stateless session
So I tuned it with sleep(delay) instead, since it is a long-running background process on a server with low resources and I have to cool down the CPU by running the job at midnight (off-peak hours).
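For reference, a stateless-session variant looks roughly like this (a sketch with a hypothetical DemoEntity mapping; the exception above is thrown when the query fetches a mapped collection, which stateless sessions do not support):

StatelessSession session = sessionFactory.openStatelessSession();
ScrollableResults results = session
        .createQuery("from DemoEntity e join fetch e.images") // fetching the collection triggers the SessionException
        .scroll(ScrollMode.FORWARD_ONLY);
while (results.next()) {
    DemoEntity entity = (DemoEntity) results.get()[0];
    // process the entity; a stateless session has no first-level cache, so no evict()/clear() is needed
}
results.close();
session.close();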
Source: https://stackoverflow.com/questions/20869473/hibernate-out-of-memory-exception-while-processing-large-collection-of-elements