Fastest way of reading relatively huge byte-files in Java

Backend · unresolved · 3 answers · 1200 views

抹茶落季 · 2020-11-30 22:59

What's probably the fastest way of reading relatively huge files with Java's I/O methods? My current solution uses a BufferedInputStream saving into a byte array.

3 Answers
  • 2020-11-30 23:13

    I would use a memory mapped file which is fast enough to do in the same thread.

    import java.io.FileInputStream;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;
    
    // map the whole file into memory as read-only
    final FileChannel channel = new FileInputStream(fileName).getChannel();
    MappedByteBuffer buffer = channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
    
    // when finished
    channel.close();
    

    This assumes the file is smaller than 2 GB; mapping it should take 10 milliseconds or less.
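
    If you actually need the contents in a byte[] (as in the question), a minimal sketch, assuming the single mapping above covers the whole file, is to copy it out of the mapped buffer:

    // copy the mapped region into a plain byte array
    byte[] data = new byte[buffer.remaining()];
    buffer.get(data);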

  • 2020-11-30 23:26

    Have a look at the Java NIO (New I/O) API. Also, this question might prove useful.

    I don't have much experience with IO, but I've heard that NIO is a much more efficient way of handling large amounts of data.
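
    For example, a minimal sketch using the java.nio.file API (Java 7+), assuming the whole file fits in memory and fileName holds its path:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    
    // read the entire file into a byte array in one call
    byte[] data = Files.readAllBytes(Paths.get(fileName));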

  • 2020-11-30 23:29

    Don't use available(): it's not reliable. And don't ignore the result of the read() method: it tells you how many bytes were actually read. And if you want to read everything into memory, use a ByteArrayOutputStream rather than a List<byte[]>:

    // "reader" is the buffered stream from the question; 16 KB read buffer
    InputStream reader = new BufferedInputStream(new FileInputStream(fileName));
    byte[] buffer = new byte[16 * 1024];
    
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int read;
    while ((read = reader.read(buffer)) >= 0) {
        baos.write(buffer, 0, read);
    }
    byte[] everything = baos.toByteArray();
    

    I think 1024 is a bit small for a buffer size; I would use a larger buffer (something like 16 KB or 32 KB).

    Note that Apache Commons IO and Guava have utility methods that do this for you, and they have already been optimized.
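
    For example, a rough sketch with those libraries (assuming commons-io and Guava are on the classpath and fileName holds the path):

    // Apache Commons IO: drains any InputStream into a byte array
    byte[] a = org.apache.commons.io.IOUtils.toByteArray(new java.io.FileInputStream(fileName));
    
    // Guava: reads a File directly into a byte array
    byte[] b = com.google.common.io.Files.toByteArray(new java.io.File(fileName));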
