Question
In the job I read from a file and store records in a database. I would like to run multiple instances of the batch job's jar in separate processes and partition the data from the file among the running instances.
I would also like to be able to keep adding files to be processed, and to distribute the reads from those files as well.
I have read that Spring XD might be a good fit, but I can't find good tutorials on it.
Yes, I am also new to both Spring Batch and Spring XD.
Answer 1:
The first thing to understand is how to remotely partition batch jobs. See the Spring Batch Integration chapter of the Spring Batch documentation for its support for remote partitioning, which builds on basic batch partitioning.
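As a rough sketch of the basic partitioning piece: a Partitioner splits the input into named ExecutionContexts, one per worker step execution. The class name, constructor, and context keys below (LineRangePartitioner, startLine, endLine) are illustrative assumptions, not anything from the question:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

// Illustrative partitioner: splits a flat file into line ranges,
// one range per worker step execution.
public class LineRangePartitioner implements Partitioner {

    private final int totalLines; // assumed to be known, e.g. counted up front

    public LineRangePartitioner(int totalLines) {
        this.totalLines = totalLines;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        int linesPerPartition = (totalLines + gridSize - 1) / gridSize;
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            // Each worker reads only its own [startLine, endLine) range,
            // typically injected into a step-scoped reader via
            // @Value("#{stepExecutionContext['startLine']}").
            context.putInt("startLine", i * linesPerPartition);
            context.putInt("endLine", Math.min((i + 1) * linesPerPartition, totalLines));
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}
```

With remote partitioning, the master step sends these partition ExecutionContexts over Spring Integration channels (typically backed by a message broker) to worker JVMs, each of which runs the step against its assigned range.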
Spring XD provides out-of-the-box support for single-step partitioned workloads: you just import singlestep-partition-support.xml and provide partitioner and tasklet beans. See the Spring XD documentation for an example.
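A minimal module definition might look like the sketch below. The import path and the com.example.* classes are assumptions based on the pattern in the XD documentation, not something taken from it verbatim, so check the actual location of singlestep-partition-support.xml in your XD distribution:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Provided by Spring XD: wires up the partitioned master/worker step plumbing.
         The exact resource path is an assumption; verify it against your XD install. -->
    <import resource="classpath:/META-INF/spring-xd/batch/singlestep-partition-support.xml"/>

    <!-- Your Partitioner implementation, e.g. the LineRangePartitioner sketched above. -->
    <bean id="partitioner" class="com.example.LineRangePartitioner">
        <constructor-arg value="100000"/>
    </bean>

    <!-- The work each worker runs against its partition. com.example.FileRangeTasklet is a
         hypothetical class implementing org.springframework.batch.core.step.tasklet.Tasklet. -->
    <bean id="tasklet" class="com.example.FileRangeTasklet"/>

</beans>
```

Once packaged as an XD job module, the containers in the XD cluster pick up the partitioned step executions, which gives you the file-splitting across JVMs described in the question.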
Source: https://stackoverflow.com/questions/33023555/how-do-you-distribute-a-spring-batch-job-effectively-across-jvms