I'm trying to read/write a huge text file, but when I do I get the error:
Exception in thread \"main\" java.lang.OutOfMemoryError: Java heap space
To read a huge file in Java you should use either java.util.Scanner or Apache Commons IO's LineIterator. Neither approach loads the whole file into memory; both read it line by line. I am able to read files larger than 1 GB using LineIterator. See http://www.baeldung.com/java-read-lines-large-file for more details and an example.
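For example, with Commons IO's LineIterator (a minimal sketch; the file name is a placeholder and the commons-io dependency is assumed):

import java.io.File;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.LineIterator;

LineIterator it = FileUtils.lineIterator(new File("bigfile.txt"), "UTF-8");
try {
    while (it.hasNext()) {
        String line = it.nextLine();
        // process one line at a time; the rest of the file stays on disk
    }
} finally {
    LineIterator.closeQuietly(it);
}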
Don't try to read large files into memory. They don't fit. Find a way of processing the file a line at a time, or a record at a time, or a chunk at a time. I can't see any reason here why you can't do that.
Calling File.exists() and File.createNewFile() immediately before constructing a FileWriter around the same File is a complete waste of time.
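For example, this is all you need - the FileWriter constructor creates the file itself if it is missing (file is the File from the question):

// no exists()/createNewFile() pre-checks needed
BufferedWriter output = new BufferedWriter(new FileWriter(file, true)); // true = append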
I tried to add a counter (count) so the buffer could be flushed after a certain number of lines read. It didn't work: I know the counter isn't behaving correctly, because it doesn't reset to zero after the expected number of iterations of the while loop. I also added a for loop before and after the while loop to reset the counter, but that didn't help either.
Any suggestions?
The out-of-memory error occurs because the file is so large that its contents cannot all be read into the local variable contents in the method getContents(File aFile).
Flushing the buffer has nothing to do with it. Using a PrintWriter instead of a BufferedWriter may help clean up the code a bit. With a PrintWriter you wouldn't have to do something like:
bw.write(content);
bw.newLine();
You can change this to:
printWriter.println(content);
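A sketch of the wiring, assuming file and content are the variables from the question:

PrintWriter printWriter = new PrintWriter(new BufferedWriter(new FileWriter(file, true)));
printWriter.println(content); // writes the content plus a line separator in one call
printWriter.close();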
You also forgot to tell us your use case. In the end, all you do is print the contents of the file, which you could have done line by line.
Try using a FileInputStream instead of a BufferedReader/Writer. Using a FileInputStream, I could copy a dummy log file with over 36 MILLION lines, almost 500 MB in size, in a matter of seconds.
// from and to are the source and destination files
try (FileInputStream in = new FileInputStream(from);    // read data from a file
     FileOutputStream out = new FileOutputStream(to)) { // write data to a file
    byte[] buffer = new byte[4096]; // buffer size, usually 1024-4096 bytes
    int len;
    while ((len = in.read(buffer, 0, buffer.length)) > 0) {
        out.write(buffer, 0, len);
    }
} // both streams are closed automatically, even if an exception is thrown
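As a side note, if the goal really is just a straight copy, java.nio.file.Files does the same thing in one call (a sketch assuming from and to are path strings; call toPath() on them if they are File objects):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

Files.copy(Paths.get(from), Paths.get(to), StandardCopyOption.REPLACE_EXISTING);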
If you want to read the file line by line instead of in chunks of bytes, you can use a BufferedReader, but in a different way:
// Removed the redundant exists()/createNewFile() calls altogether
String line;
BufferedReader br = new BufferedReader(new FileReader(aFile));
BufferedWriter output = new BufferedWriter(new FileWriter(file, true));
while ((line = br.readLine()) != null) {
    String modified1 = line.substring(2, 17);
    String modified2 = line.substring(18, 33);
    String modified3 = line.substring(40);
    // Build the output record directly; no placeholder value needed
    String result = modified1 + ",," + modified2 + modified3;
    System.out.println(result);
    output.append(result + "\n"); // use \r\n for Windows line endings
}
// Close the streams
br.close();
output.close();
Like EJP said, don't read an entire file into memory - that's not a smart thing to do at all. Your best bet is to read the file line by line or in chunks - although, for accuracy, reading it line by line might be best.
Inside the while ((line = br.readLine()) != null) loop, do the work that would otherwise have needed the entire file in memory, since only one line is loaded at a time (such as checking whether the line contains _, or grabbing text from it).
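For example, a filter like this keeps only one line in memory at a time (a sketch reusing the br and output streams from the snippet above; the _ check is just an illustration):

while ((line = br.readLine()) != null) {
    if (line.contains("_")) { // only this single line is in memory here
        output.append(line);
        output.newLine(); // platform line separator
    }
}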
Another thing you could try to avoid the OOM exception is to use multiple Strings.
if (contents.length() >= Integer.MAX_VALUE - 5000) { // -5000 to give some headroom when checking
    . . .
}
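A rough sketch of that idea, assuming you accumulate into StringBuilders and roll over to a fresh one before hitting the String length limit (the 5000-character headroom is taken from the check above):

import java.util.ArrayList;
import java.util.List;

List<StringBuilder> parts = new ArrayList<>();
StringBuilder current = new StringBuilder();
while ((line = br.readLine()) != null) {
    if (current.length() >= Integer.MAX_VALUE - 5000) { // headroom before the limit
        parts.add(current); // roll over to a fresh builder
        current = new StringBuilder();
    }
    current.append(line).append('\n');
}
parts.add(current);

Note that this still keeps everything on the heap; it only works around the per-String length limit, so line-by-line processing is still the better fix.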