java : writing large files?


If you really insist on using Java for this, then the best way is to write each row immediately as it comes in, rather than collecting the whole ResultSet into Java's memory first. Otherwise you would need at least that much free heap space in Java.

Thus, do e.g.

while (resultSet.next()) {
    // write each row straight to the file; nothing accumulates in memory
    writer.write(resultSet.getString("columnname"));
    // ... remaining columns and a line separator
}
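
For completeness, here is a minimal sketch of that approach; the table name, column names, and file name are made up for illustration, and setFetchSize is only a driver-dependent hint (MySQL's Connector/J, for instance, needs Integer.MIN_VALUE for true row-by-row streaming):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class MemberCsvExport {
    // Streams rows from the database straight into a CSV file.
    public static void export(Connection connection) throws Exception {
        try (Statement stmt = connection.createStatement();
             BufferedWriter writer = new BufferedWriter(new FileWriter("members.csv"))) {
            stmt.setFetchSize(1000); // hint to stream rows rather than buffer them all
            try (ResultSet rs = stmt.executeQuery("SELECT name, email FROM members")) {
                while (rs.next()) {
                    writer.write(rs.getString("name"));
                    writer.write(',');
                    writer.write(rs.getString("email"));
                    writer.newLine(); // one row per line
                }
            }
        }
    }
}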

That said, most decent DBs ship with built-in export-to-CSV capabilities which are undoubtedly far more efficient than anything you could write in Java. You didn't mention which one you're using, but if it were for example MySQL, you could use SELECT ... INTO OUTFILE for this (LOAD DATA INFILE is its import counterpart). Just refer to the DB-specific documentation. Hope this gives new insights.
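
As a rough illustration of the MySQL route driven from JDBC (note the caveats: the output path is resolved on the database server, not the client, the MySQL account needs the FILE privilege, and the table and column names are again invented):

try (Statement stmt = connection.createStatement()) {
    // MySQL writes the file itself; no rows travel through Java at all
    stmt.execute("SELECT name, email INTO OUTFILE '/tmp/members.csv' "
               + "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' "
               + "FROM members");
}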

The default buffer size for a BufferedWriter is 8192 characters. If you are going to be writing multi-gigabyte files, you might want to increase this using the two-argument constructor; e.g.

int buffSize = 1024 * 1024; // 1 megabyte or so
BufferedWriter mbrWriter = new BufferedWriter(new FileWriter(memberCSV), buffSize);

This should reduce the number of syscalls needed to write the file.

But I doubt that this would make more than a couple of percent difference. Pulling rows from the ResultSet will probably be the main performance bottleneck. For significant improvements in performance you'd need to use the database's native bulk-export facilities.

I'm not 100% sure, but it appears that BufferedReader loads the data into a buffer in RAM. Java can use 128 MB of RAM (unless otherwise specified), so the BufferedReader will likely overflow Java's memory, causing an error. Try using InputStreamReader and FileInputStream to read, store the data in a char array, and then write that array out.
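
If you go that route, here is a minimal sketch of the idea with a fixed-size char buffer; note that the character-stream counterpart on the output side is an OutputStreamWriter wrapped around the FileOutputStream, and the file names are placeholders:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

try (InputStreamReader reader = new InputStreamReader(new FileInputStream("input.csv"));
     OutputStreamWriter writer = new OutputStreamWriter(new FileOutputStream("output.csv"))) {
    char[] buffer = new char[8192]; // fixed-size buffer, so memory use stays constant
    int n;
    while ((n = reader.read(buffer)) != -1) {
        writer.write(buffer, 0, n);
    }
}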
