Java: Efficiently converting an array of longs to an array of bytes

Backend · 3 answers · 535 views
后悔当初 asked 2021-01-21 20:22

I have an array of longs I want to write to disk. The most efficient disk I/O functions take in byte arrays, for example:

    FileOutputStream.write(byte[] b)

Is there an easy way to convert a long[] to a byte[] so I can use these functions?
3 Answers
  • 2021-01-21 20:50

    No, there is not a trivial way to convert from a long[] to a byte[].

    Your best option is likely to wrap your FileOutputStream with a BufferedOutputStream and then write out the individual byte values for each long (using bitwise operators).

    Another option is to create a ByteBuffer and put your long values into the ByteBuffer and then write that to a FileChannel. This handles the endianness conversion for you, but makes the buffering more complicated.
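    As a minimal sketch of the first option (the class and method names here are my own; it writes each long big-endian, the same byte order DataOutputStream.writeLong uses):

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class LongWriter {
    // Write each long as 8 bytes, most significant byte first,
    // using bitwise shifts.
    static void writeLongs(OutputStream out, long[] values) throws IOException {
        for (long v : values) {
            for (int shift = 56; shift >= 0; shift -= 8) {
                // OutputStream.write(int) keeps only the low 8 bits
                out.write((int) (v >>> shift));
            }
        }
    }

    public static void main(String[] args) throws IOException {
        long[] data = { 1L, -1L, Long.MAX_VALUE };
        try (OutputStream out = new BufferedOutputStream(
                new FileOutputStream("longs.bin"))) {
            writeLongs(out, data);
        }
    }
}
```

    The BufferedOutputStream matters here: without it, every write(int) call would be a separate system call.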

  • 2021-01-21 20:50

    OP here.

    I have thought of one approach: ByteBuffer.asLongBuffer() returns an instance of ByteBufferAsLongBufferB, a class which wraps ByteBuffer in an interface for treating the data as longs while properly managing endianness. I could extend ByteBufferAsLongBufferB, and add a method to return the raw byte buffer (which is protected).

    But this seems so esoteric and convoluted I feel there must be an easier way. Either that, or something in my approach is flawed.

  • 2021-01-21 20:59

    Concerning efficiency, many details will in fact hardly make a difference. The hard disk is by far the slowest part involved here: in the time it takes to write a single byte to the disk, you could have converted thousands or even millions of longs to bytes. Any performance test here will tell you less about the implementation than about the hard disk. If in doubt, one should run dedicated benchmarks comparing the different conversion strategies and the different writing methods, respectively.

    Assuming the main goal is a convenient conversion that does not impose unnecessary overhead, I'd propose the following approach:

    One can create a ByteBuffer of sufficient size, view it as a LongBuffer, use the bulk LongBuffer#put(long[]) method (which takes care of endianness conversion, if necessary, and does so as efficiently as possible), and finally write the original ByteBuffer (now filled with the long values) to the file using a FileChannel.

    Following this idea, I think that this method is convenient and (most likely) rather efficient:

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ByteOrder;

    private static void bulkAndChannel(String fileName, long[] longArray)
    {
        ByteBuffer bytes = 
            ByteBuffer.allocate(longArray.length * Long.BYTES);
        // The LongBuffer view shares the ByteBuffer's storage, so the bulk
        // put fills the byte buffer directly, handling endianness on the fly
        bytes.order(ByteOrder.nativeOrder()).asLongBuffer().put(longArray);
        try (FileOutputStream fos = new FileOutputStream(fileName))
        {
            fos.getChannel().write(bytes);
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
    }
    

    (Of course, one could argue about whether allocating one "large" buffer is the best idea. But thanks to the convenience methods of the Buffer classes, this could easily be modified to write "chunks" of data of an appropriate size, for the case where the array is huge and the memory overhead of creating the corresponding ByteBuffer would be prohibitive.)
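    As a sketch of that chunked variant (the class, method, and parameter names are my own invention), one could reuse a single fixed-size buffer and refill it per chunk:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.channels.FileChannel;

public class ChunkedLongWriter {
    // Write longArray to a file, at most chunkLongs values at a time,
    // reusing one ByteBuffer instead of allocating array.length * 8 bytes.
    static void writeChunked(String fileName, long[] longArray, int chunkLongs)
            throws IOException {
        ByteBuffer bytes = ByteBuffer.allocate(chunkLongs * Long.BYTES)
                .order(ByteOrder.nativeOrder());
        try (FileOutputStream fos = new FileOutputStream(fileName)) {
            FileChannel channel = fos.getChannel();
            int offset = 0;
            while (offset < longArray.length) {
                int n = Math.min(chunkLongs, longArray.length - offset);
                bytes.clear();
                // Bulk-copy the next n longs through the LongBuffer view
                bytes.asLongBuffer().put(longArray, offset, n);
                bytes.limit(n * Long.BYTES);
                // A channel may write fewer bytes than requested, so loop
                while (bytes.hasRemaining()) {
                    channel.write(bytes);
                }
                offset += n;
            }
        }
    }
}
```

    The only extra cost compared to the one-shot version is refilling the buffer once per chunk; the bulk put and the channel write stay the same.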
