Hi, thanks for your attention.
I want to export a lot of data, really a lot of data (6 million rows), to a .csv file using Java. The app is a Swing application that uses JPA.
The answer is to use a "stream" approach - i.e. read one row, write one row as you scroll through the dataset. You need to obtain the query result as a cursor and iterate through it, rather than fetching the whole result set into memory.
ScrollableResults is Hibernate's API rather than plain JPA, but if Hibernate is your JPA provider you can unwrap the underlying Session and use something like this:
ScrollableResults cursor = session.createQuery("from SomeEntity x")
        .scroll(ScrollMode.FORWARD_ONLY);      // plain forward-only cursor
while (cursor.next()) {
    writeToFile((SomeEntity) cursor.get(0));   // get(0) is the entity of the current row
}
cursor.close();
This means only the current row has to be materialised at a time, so memory use stays flat however many rows you export, and it is usually faster as well because you start writing immediately instead of waiting for the whole result set to load. One caveat: a regular Session still keeps every loaded entity in its first-level cache, so clear the session every few thousand rows (or use a StatelessSession).
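Putting it together, a sketch of what the export could look like with a Hibernate 5-style Session (the entity name SomeEntity, its getId()/getName() accessors, the export.csv path and the 1000-row batch size are all placeholders, and the CSV line is written without any escaping):

import java.io.BufferedWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;

public void exportToCsv(Session session) throws Exception {
    ScrollableResults cursor = session.createQuery("from SomeEntity x")
            .setReadOnly(true)                     // we only read, never update
            .scroll(ScrollMode.FORWARD_ONLY);      // forward-only cursor over the result
    try (BufferedWriter out = Files.newBufferedWriter(Paths.get("export.csv"))) {
        int count = 0;
        while (cursor.next()) {
            SomeEntity row = (SomeEntity) cursor.get(0);    // entity of the current row
            out.write(row.getId() + "," + row.getName());   // placeholder columns, no CSV escaping
            out.newLine();
            if (++count % 1000 == 0) {
                session.clear();                   // keep the first-level cache from growing
            }
        }
    } finally {
        cursor.close();
    }
}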
Loading all rows into a result set at once is a convenience that works fine for small result sets (which is most of the time), but as usual, convenience comes at a cost and it doesn't work in every situation.
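If you would rather stick to the standard JPA API, JPA 2.2 added Query.getResultStream(), which providers such as Hibernate may back with a similar lazy cursor. A minimal sketch, assuming an EntityManager and the same hypothetical writeToFile helper from above:

import java.util.stream.Stream;
import javax.persistence.EntityManager;

public void export(EntityManager entityManager) {
    // try-with-resources closes the stream and releases the underlying cursor
    try (Stream<SomeEntity> rows = entityManager
            .createQuery("select x from SomeEntity x", SomeEntity.class)
            .getResultStream()) {                  // JPA 2.2+, fetched lazily by most providers
        rows.forEach(row -> writeToFile(row));
    }
}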