Parallel File Processing: What are the recommended ways?

Submitted by 妖精的绣舞 on 2020-01-03 16:55:36

Question


This is by and large a combination of a design and a coding problem.

Use Case
- Given many log files ranging from 2 MB to 2 GB, I need to parse each of these logs, apply some processing, and generate Java POJOs.
- For this problem, let's assume that we have just one log file.
- Also, the idea is to make the best use of the system; multiple cores are available.

Alternative 1
- Open the file (synchronously), read each line, generate POJOs

FileActor -> read each line -> List<POJO>  

Pros: simple to understand
Cons: serial processing; does not take advantage of the multiple cores in the system (a minimal sketch of this approach follows)
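
Here is a minimal sketch of Alternative 1, assuming a hypothetical YourBean POJO whose constructor parses one log line; the class name, file path, and parsing are illustrative, not part of the question:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

List<YourBean> pojos = new ArrayList<>();
try (BufferedReader reader = Files.newBufferedReader(Paths.get("app.log"))) {
    String line;
    while ((line = reader.readLine()) != null) { // serial: one line at a time
        pojos.add(new YourBean(line));
    }
} catch (IOException e) {
    e.printStackTrace();
}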

Alternative 2
- Open the file (synchronously), read N lines at a time (N is configurable), and pass them on to different actors for processing

                                                    / LogLineProcessActor 1
FileActor -> LogLineProcessRouter (with 10 Actors) -- LogLineProcessActor 2
                                                    \ LogLineProcessActor 10

Pros: some parallelization, by using different actors to process batches of lines; the actors can make use of the available cores in the system (though how well is unclear to me)
Cons: still serial, because the file is read in a serial fashion (a sketch of the hand-off idea follows)
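
To illustrate the "read N lines, hand them off" idea without committing to an actor framework, here is a sketch using a plain ExecutorService instead of actors; the batch size N, the pool size, the file path, and the parsing step are all assumptions:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

int n = 1_000;                                        // lines per batch
ExecutorService pool = Executors.newFixedThreadPool(10);
try (BufferedReader reader = Files.newBufferedReader(Paths.get("app.log"))) {
    List<String> batch = new ArrayList<>(n);
    String line;
    while ((line = reader.readLine()) != null) {      // still a serial read
        batch.add(line);
        if (batch.size() == n) {
            List<String> chunk = batch;               // hand the full batch off
            pool.submit(() -> chunk.forEach(l -> { /* parse l into a POJO */ }));
            batch = new ArrayList<>(n);
        }
    }
    List<String> rest = batch;                        // flush the final partial batch
    if (!rest.isEmpty()) {
        pool.submit(() -> rest.forEach(l -> { /* parse l into a POJO */ }));
    }
} catch (IOException e) {
    e.printStackTrace();
} finally {
    pool.shutdown();
}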

Questions
- Is either of the above choices a good one?
- Are there better alternatives?

Please share your thoughts here.

Thanks a lot


Answer 1:


Why not take advantage of what's already available and use the parallel stream support that comes with JDK 1.8? I would start with something like this and see how it performs:

import java.nio.file.Files;
import java.nio.file.Paths;

Files.lines(Paths.get( /* path to a log file */ )) // throws IOException; close the stream when done
     .parallel()           // make the stream do its work in parallel
     .map(YourBean::new)   // or some mapping method to your bean class
     .forEach( /* process the beans here */ );

You may need some tweaks to the thread pooling, because parallel() is by default executed on ForkJoinPool.commonPool(), and you can't really customize that to achieve maximum performance, but people seem to have found workarounds for that too; there is some material about the topic here, and a sketch of the usual workaround follows.
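
The workaround people usually land on is to run the terminal operation from inside a custom ForkJoinPool, so the parallel stream executes on that pool's workers instead of the common pool. A sketch, reusing the hypothetical YourBean from above; note this behaviour is an implementation detail of fork/join, not a documented guarantee:

import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Stream;

ForkJoinPool customPool = new ForkJoinPool(8); // pool size tuned to your cores
customPool.submit(() -> {
    try (Stream<String> lines = Files.lines(Paths.get("app.log"))) { // placeholder path
        lines.parallel()                       // runs on customPool's workers
             .map(YourBean::new)
             .forEach(bean -> { /* process each bean */ });
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    }
}).join();
customPool.shutdown();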




Answer 2:


Alternative 2 looks good. I would just change one thing: read the biggest chunk of the file you can, because I/O will be a problem if you do it in small bursts. Since there are several files, I would create an actor that gets the file names by reading a particular folder and then sends the path of each file to a LogLineReader. That reader reads a big chunk of the file and finally sends each line to a LogLineProcessActor. Be aware that the actors may process the lines out of order; if that is not a problem, they will keep your CPU busy. A sketch of this pipeline follows.
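
A minimal sketch of that pipeline using Akka's classic Java API (AbstractActor, available since Akka 2.5); the actor names match the answer, but the folder path, pool size, and parsing step are illustrative assumptions:

import akka.actor.AbstractActor;
import akka.actor.ActorRef;
import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.routing.RoundRobinPool;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class LogPipeline {

    // Reads one file and forwards each line to the processing router.
    static class LogLineReader extends AbstractActor {
        private final ActorRef processors;
        LogLineReader(ActorRef processors) { this.processors = processors; }

        @Override
        public Receive createReceive() {
            return receiveBuilder()
                .match(Path.class, path -> {
                    try (Stream<String> lines = Files.lines(path)) { // buffered read
                        lines.forEach(line -> processors.tell(line, self()));
                    }
                })
                .build();
        }
    }

    // Parses a single line; lines may arrive (and be processed) out of order.
    static class LogLineProcessActor extends AbstractActor {
        @Override
        public Receive createReceive() {
            return receiveBuilder()
                .match(String.class, line -> {
                    /* YourBean bean = new YourBean(line); ... */
                })
                .build();
        }
    }

    public static void main(String[] args) throws IOException {
        ActorSystem system = ActorSystem.create("logs");
        ActorRef processors = system.actorOf(
            new RoundRobinPool(10).props(Props.create(LogLineProcessActor.class)),
            "processors");
        ActorRef reader = system.actorOf(
            Props.create(LogLineReader.class, processors), "reader");

        // The file-listing step: send the path of every file in a folder.
        try (Stream<Path> files = Files.list(Paths.get("/var/log/app"))) { // assumed folder
            files.forEach(p -> reader.tell(p, ActorRef.noSender()));
        }
    }
}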

If you feel adventurous, you could also try the new Akka Streams 1.0; a sketch against a later version of that API follows.
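
For comparison, here is a sketch against the later Akka Streams Java API (around version 2.6); the experimental 1.0 API the answer mentions differed in its details, and YourBean is the same illustrative stand-in as above:

import akka.actor.ActorSystem;
import akka.stream.javadsl.FileIO;
import akka.stream.javadsl.Framing;
import akka.stream.javadsl.FramingTruncation;
import akka.stream.javadsl.Sink;
import akka.util.ByteString;
import java.nio.file.Paths;
import java.util.concurrent.CompletableFuture;

ActorSystem system = ActorSystem.create("streams");
FileIO.fromPath(Paths.get("app.log"))                     // placeholder path
      .via(Framing.delimiter(ByteString.fromString("\n"), // split bytes into lines
                             64 * 1024, FramingTruncation.ALLOW))
      .map(ByteString::utf8String)
      .mapAsyncUnordered(10, line ->                      // up to 10 parses in flight
          CompletableFuture.supplyAsync(() -> new YourBean(line)))
      .runWith(Sink.ignore(), system);                    // replace with a real sink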



Source: https://stackoverflow.com/questions/30109438/parallel-file-processing-what-are-recommended-ways
