I need to read a large text file of around 5-6 GB line by line using Java.
How can I do this quickly?
A straightforward way to achieve this is with Scanner. For example, if you have dataFile.txt in your current directory:
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Scanner;

public class readByLine {

    public readByLine() throws FileNotFoundException {
        Scanner linReader = new Scanner(new File("dataFile.txt"));
        while (linReader.hasNextLine()) {
            String line = linReader.nextLine();
            System.out.println(line);
        }
        linReader.close();
    }

    public static void main(String[] args) throws FileNotFoundException {
        new readByLine();
    }
}
The output is simply each line of the file printed to the console.
For reading a file with Java 8 streams:
package com.java.java8;

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

/**
 * The Class ReadLargeFile.
 *
 * @author Ankit Sood Apr 20, 2017
 */
public class ReadLargeFile {

    /**
     * The main method.
     *
     * @param args the arguments
     */
    public static void main(String[] args) {
        try {
            Stream<String> stream = Files.lines(Paths.get("C:\\Users\\System\\Desktop\\demoData.txt"));
            stream.forEach(System.out::println);
        }
        catch (Exception e) {
            e.printStackTrace();
        }
    }
}
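Note that Files.lines keeps the file handle open until the stream is closed, so for large files it is better to wrap the stream in try-with-resources. A minimal sketch of the same example (the class name and path are just placeholders):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class ReadLargeFileClosed {

    public static void main(String[] args) {
        // try-with-resources closes the stream, and with it the underlying file handle
        try (Stream<String> stream = Files.lines(Paths.get("C:\\Users\\System\\Desktop\\demoData.txt"))) {
            stream.forEach(System.out::println);
        }
        catch (IOException e) {
            e.printStackTrace();
        }
    }
}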
Look at this blog:
The buffer size may be specified, or the default size may be used. The default is large enough for most purposes.
// Open the file
FileInputStream fstream = new FileInputStream("textfile.txt");
BufferedReader br = new BufferedReader(new InputStreamReader(fstream));
String strLine;

// Read the file line by line
while ((strLine = br.readLine()) != null) {
    // Print the content on the console
    System.out.println(strLine);
}

// Close the reader (this also closes the underlying stream)
br.close();
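If you do want a different buffer size, BufferedReader has a constructor that takes the size in characters; a sketch, where the 8 MB value is only an illustration, not a recommendation from the blog:

// Same as above, but with an explicit 8 MB buffer instead of the default
BufferedReader br = new BufferedReader(
        new InputStreamReader(new FileInputStream("textfile.txt")), 8 * 1024 * 1024);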
I documented and tested 10 different ways to read a file in Java and then ran them against each other by making them read in test files from 1KB to 1GB. Here are the fastest 3 file reading methods for reading a 1GB test file.
Note that when running the performance tests I didn't output anything to the console since that would really slow down the test. I just wanted to test the raw reading speed.
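As a rough sketch of how such a measurement can be taken (my own illustration, not the author's actual test harness), each method can be wrapped with System.nanoTime():

long start = System.nanoTime();
// ... run one of the file-reading methods here, consuming the data without printing it ...
long elapsedMs = (System.nanoTime() - start) / 1_000_000;
System.out.println("Read took " + elapsedMs + " ms");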
1) java.nio.file.Files.readAllBytes()
Tested in Java 7, 8, 9. This was overall the fastest method. Reading a 1GB file was consistently just under 1 second.
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class ReadFile_Files_ReadAllBytes {
    public static void main(String[] pArgs) throws IOException {
        String fileName = "c:\\temp\\sample-1GB.txt";
        File file = new File(fileName);

        byte[] fileBytes = Files.readAllBytes(file.toPath());
        char singleChar;
        for (byte b : fileBytes) {
            singleChar = (char) b;
            System.out.print(singleChar);
        }
    }
}
2) java.nio.file.Files.lines()
This was tested successfully in Java 8 and 9 but it won't work in Java 7 because of the lack of support for lambda expressions. It took about 3.5 seconds to read in a 1GB file which put it in second place as far as reading larger files.
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.stream.Stream;

public class ReadFile_Files_Lines {
    public static void main(String[] pArgs) throws IOException {
        String fileName = "c:\\temp\\sample-1GB.txt";
        File file = new File(fileName);

        try (Stream<String> linesStream = Files.lines(file.toPath())) {
            linesStream.forEach(line -> {
                System.out.println(line);
            });
        }
    }
}
3) BufferedReader
Tested to work in Java 7, 8, 9. This took about 4.5 seconds to read in a 1GB test file.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadFile_BufferedReader_ReadLine {
    public static void main(String[] args) throws IOException {
        String fileName = "c:\\temp\\sample-1GB.txt";
        FileReader fileReader = new FileReader(fileName);

        try (BufferedReader bufferedReader = new BufferedReader(fileReader)) {
            String line;
            while ((line = bufferedReader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
You can find the complete rankings for all 10 file reading methods here.
Using the org.apache.commons.io package gave better performance, especially in legacy code running on Java 6 and below. Java 7 has a better API with less exception handling and more useful methods:
LineIterator lineIterator = null;
try {
    // The second parameter (the charset) is optional
    lineIterator = FileUtils.lineIterator(new File("/home/username/m.log"), "windows-1256");
    while (lineIterator.hasNext()) {
        String currentLine = lineIterator.next();
        // Some operation
    }
}
finally {
    LineIterator.closeQuietly(lineIterator);
}
<!-- https://mvnrepository.com/artifact/commons-io/commons-io -->
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.6</version>
</dependency>
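In recent commons-io versions LineIterator also implements Closeable, so on Java 7+ you can, as far as I know, use try-with-resources instead of closeQuietly (which is deprecated in 2.6). A sketch with the same example path; the class name is just a placeholder:

import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.LineIterator;

public class ReadWithLineIterator {
    public static void main(String[] args) throws IOException {
        // try-with-resources closes the iterator and the underlying reader automatically
        try (LineIterator lineIterator = FileUtils.lineIterator(new File("/home/username/m.log"), "windows-1256")) {
            while (lineIterator.hasNext()) {
                String currentLine = lineIterator.next();
                // Some operation
            }
        }
    }
}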
Since Java 8 (March 2014) you can use streams:
try (Stream<String> lines = Files.lines(Paths.get(filename), Charset.defaultCharset())) {
    lines.forEachOrdered(line -> process(line));
}
Printing all the lines in the file:
try (Stream<String> lines = Files.lines(file, Charset.defaultCharset())) {
    lines.forEachOrdered(System.out::println);
}
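Since Files.lines is lazy, the same pattern also lets you process a 5-6 GB file without loading it into memory, for example counting lines that match some condition (the filter predicate here is just an illustration):

try (Stream<String> lines = Files.lines(Paths.get(filename), Charset.defaultCharset())) {
    long matches = lines.filter(line -> line.contains("ERROR")).count();
    System.out.println(matches + " matching lines");
}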