Java multithreading reading a single large file

长发绾君心 2021-02-09 06:44

What is an efficient way for a Java multithreaded application where many threads have to read the exact same file (> 1GB in size) and expose it as an input stream? I've noticed …

4 Answers
  •  梦毁少年i 2021-02-09 06:46

    It seems to me that you're going to have to load the file into memory if you want to avoid IO contention. The operating system will do some buffering, but if you're finding that's not enough, you're going to have to do it yourself.
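
    If the file does fit in memory, a minimal sketch of that idea (the file name below is made up) is to read it once into a shared byte[] and give every thread its own ByteArrayInputStream over it:

        import java.io.ByteArrayInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.nio.file.Files;
        import java.nio.file.Paths;

        public class SharedFileDemo {
            public static void main(String[] args) throws IOException, InterruptedException {
                // Read the file from disk exactly once. The array is never written to
                // afterwards, so it can be shared across threads without locking.
                // (A single byte[] tops out at ~2GB and needs enough heap, hence the
                // "lots of smaller arrays" variant further down.)
                byte[] data = Files.readAllBytes(Paths.get("big-file.bin")); // hypothetical path

                Runnable task = () -> {
                    // Each thread wraps the shared buffer in its own stream,
                    // so reads are independent and cause no further disk IO.
                    try (InputStream in = new ByteArrayInputStream(data)) {
                        // ... process the stream ...
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                };

                Thread t1 = new Thread(task);
                Thread t2 = new Thread(task);
                t1.start();
                t2.start();
                t1.join();
                t2.join();
            }
        }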

    Do you really need 32 threads though? Presumably you don't have nearly that many cores - so use fewer threads and you'll get less context switching etc.
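
    For example (just a sketch), the pool size can be taken from the machine instead of being hard-coded at 32; how much this helps depends on how CPU-bound the processing is:

        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        public class PoolSizing {
            public static void main(String[] args) {
                // Size the worker pool to the hardware rather than hard-coding 32;
                // for CPU-bound work, extra threads mostly add context switching.
                int workers = Runtime.getRuntime().availableProcessors();
                ExecutorService pool = Executors.newFixedThreadPool(workers);
                System.out.println("Using " + workers + " worker threads");
                pool.shutdown();
            }
        }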

    Do your threads all process the file from start to finish? If so, could you effectively split the file into chunks? Read the first (say) 10MB of data into memory, let all the threads process it, then move on to the next 10MB etc.
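
    A sketch of that chunked approach (file name made up; readNBytes needs Java 9+): the main thread reads one 10MB chunk, hands the same chunk to every worker, waits for all of them via invokeAll, and only then reads the next chunk.

        import java.io.InputStream;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.ArrayList;
        import java.util.Arrays;
        import java.util.List;
        import java.util.concurrent.Callable;
        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;

        public class ChunkedProcessing {
            static final int CHUNK_SIZE = 10 * 1024 * 1024; // 10MB per pass

            public static void main(String[] args) throws Exception {
                int workers = Runtime.getRuntime().availableProcessors();
                ExecutorService pool = Executors.newFixedThreadPool(workers);
                try (InputStream in = Files.newInputStream(Paths.get("big-file.bin"))) { // hypothetical path
                    byte[] buf = new byte[CHUNK_SIZE];
                    int read;
                    while ((read = in.readNBytes(buf, 0, CHUNK_SIZE)) > 0) {
                        // Give every worker the same chunk and wait until they all
                        // finish before reading the next one.
                        byte[] chunk = Arrays.copyOf(buf, read);
                        List<Callable<Void>> tasks = new ArrayList<>();
                        for (int i = 0; i < workers; i++) {
                            tasks.add(() -> {
                                process(chunk); // stand-in for whatever each thread does
                                return null;
                            });
                        }
                        pool.invokeAll(tasks);
                    }
                } finally {
                    pool.shutdown();
                }
            }

            static void process(byte[] chunk) {
                // placeholder for per-thread work
            }
        }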

    If that doesn't work for you, how much memory do you have compared with the size of the file? If you have plenty of memory but you don't want to allocate one huge array, you could read the whole file into memory, but into lots of separate smaller byte arrays. You'd then have to write an input stream which spans all of those byte arrays, but that should be doable.
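
    One way to sketch that spanning stream without writing a custom InputStream subclass is to chain per-chunk ByteArrayInputStreams together with the JDK's SequenceInputStream (the class name and path below are made up; readNBytes needs Java 9+):

        import java.io.ByteArrayInputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.SequenceInputStream;
        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.ArrayList;
        import java.util.Collections;
        import java.util.List;

        public class InMemoryChunkedFile {
            private final List<byte[]> chunks = new ArrayList<>();

            // Load the whole file into many modest-sized arrays instead of one huge one,
            // which avoids needing a single contiguous multi-GB allocation.
            public InMemoryChunkedFile(String path, int chunkSize) throws IOException {
                try (InputStream in = Files.newInputStream(Paths.get(path))) {
                    byte[] buf = new byte[chunkSize];
                    int read;
                    while ((read = in.readNBytes(buf, 0, chunkSize)) > 0) {
                        byte[] chunk = new byte[read];
                        System.arraycopy(buf, 0, chunk, 0, read);
                        chunks.add(chunk);
                    }
                }
            }

            // Every caller gets its own stream that walks the shared, read-only chunks,
            // so any number of threads can "read the file" concurrently.
            public InputStream newStream() {
                List<InputStream> streams = new ArrayList<>();
                for (byte[] chunk : chunks) {
                    streams.add(new ByteArrayInputStream(chunk));
                }
                return new SequenceInputStream(Collections.enumeration(streams));
            }
        }

    Each thread would then call newStream() to get an independent stream over the same shared chunks, so nothing touches the disk after the initial load.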
