I have a Spring Batch job that reads files matching a naming pattern from a directory, does some processing, and writes back the processing status per line of the input file.
I think this solution can produce wrong output, because MultiResourceItemReader
reads a chunk of items from the first file and, if that chunk doesn't satisfy your completion policy, continues reading from the next resource; in that case a single chunk contains items from two different sources, and the itemWriter
bean is not synchronized to write each item to the right output resource.
You can try to solve this by letting a.b.c.d.MyObject
implement org.springframework.batch.item.ResourceAware, then writing a custom multi-resource item writer that opens/closes the underlying resource whenever the resource changes (this solution is not trivial because, of course, it requires saving state for restartability).
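To make this concrete, here is a hypothetical sketch of the item class (the `line` payload field is an assumption): implementing ResourceAware is what lets MultiResourceItemReader inject the originating Resource into each item, so the writer below can tell which file an item came from.

```java
import org.springframework.batch.item.ResourceAware;
import org.springframework.core.io.Resource;

// Hypothetical item class: MultiResourceItemReader calls setResource()
// on every item that implements ResourceAware, tagging it with its source file.
public class MyObject implements ResourceAware {

    private Resource resource;
    private String line; // assumed payload: one line of the input file

    @Override
    public void setResource(Resource resource) {
        this.resource = resource;
    }

    public Resource getResource() {
        return resource;
    }

    public String getLine() {
        return line;
    }

    public void setLine(String line) {
        this.line = line;
    }
}
```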
class MyItemWriter implements ItemWriter<MyObject>, ItemStream {

    private FlatFileItemWriter<MyObject> delegate;
    private String resourceName;

    @Override
    public void open(ExecutionContext e) {
        // null on the first run, restored on restart
        resourceName = e.getString("resourceName", null);
        delegate.open(e);
    }

    @Override
    public void close() {
        delegate.close();
    }

    @Override
    public void update(ExecutionContext e) {
        // save the current resource so a restart can resume on the right file
        e.putString("resourceName", resourceName);
        delegate.update(e);
    }

    @Override
    public void write(List<? extends MyObject> items) throws Exception {
        for (final MyObject o : items) {
            String itemResource = o.getResource().toString();
            // when the item's source file changes, switch the delegate
            // to the matching output resource
            if (!itemResource.equals(resourceName)) {
                closeDelegate();
                resourceName = itemResource;
                openDelegate();
            }
            writeToDelegate(o);
        }
    }

    // closeDelegate()/openDelegate()/writeToDelegate() manage the
    // FlatFileItemWriter against the current resourceName (not shown here)
}
Of course, this is not tested; it's just an idea.
Another way is quite different: simulate a loop using a decider. Create a holder bean (a ResourceHolderBean
, for example) used to store the resources matching file:#{jobParameters['cwd']}/#{jobParameters['inputFolder']}/file-*.txt
; this bean is not stored in the execution context but is filled by a JobExecutionListener at every job start, because execution contexts should stay small.

<batch:job id="job">
 <batch:decision decider="decider" id="decision">
  <batch:next on="CONTINUABLE" to="readWriteStep" />
  <batch:end on="FINISHED" />
 </batch:decision>
 <batch:step id="readWriteStep" next="decision">
  <batch:tasklet>
   <!-- commit-interval is an assumed example value -->
   <batch:chunk reader="itemReader" writer="itemWriter" commit-interval="10" />
  </batch:tasklet>
 </batch:step>
 <batch:listeners>
  <batch:listener>
   <bean class="ResourceHolderFiller">
    <property name="resources" value="file:#{jobParameters['cwd']}/#{jobParameters['inputFolder']}/file-*.txt" />
    <property name="resourceHolderBean" ref="resourceHolderBean" />
   </bean>
  </batch:listener>
 </batch:listeners>
</batch:job>
<bean id="resourceHolderBean" class="ResourceHolderBean" />
<bean id="decider" class="MyDecider">
 <property name="resourceBean" ref="resourceHolderBean" />
</bean>
class ResourceHolderBean {

    private Resource[] resources;

    public void setResources(Resource[] resources) {
        this.resources = resources;
    }

    public Resource[] getResources() {
        return resources;
    }
}
class MyDecider implements JobExecutionDecider {

    private ResourceHolderBean resourceBean;

    public void setResourceBean(ResourceHolderBean resBean) {
        this.resourceBean = resBean;
    }

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        // defaults to 0 on the first pass, when the key is not yet present
        int resourceIndex = jobExecution.getExecutionContext().getInt("resourceIndex", 0);
        if (resourceIndex == resourceBean.getResources().length) {
            return new FlowExecutionStatus(RepeatStatus.FINISHED.name());
        }
        // This is the key your reader/writer must use as input/output filename
        jobExecution.getExecutionContext().putString("resourceName",
                resourceBean.getResources()[resourceIndex].toString());
        // advance the index so the next decision moves to the next resource
        jobExecution.getExecutionContext().putInt("resourceIndex", resourceIndex + 1);
        return new FlowExecutionStatus("CONTINUABLE");
    }
}
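The loop semantics of decide() can be checked in isolation. A minimal plain-Java sketch of the index-advancing logic, using a Map as a stand-in for the JobExecution's ExecutionContext (an assumption made purely for testability):

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java model of the decider's loop: each call publishes the current
// resource name and advances the index; FINISHED when resources are exhausted.
public class DeciderLogic {

    static String decide(Map<String, Object> ctx, String[] resources) {
        int resourceIndex = (int) ctx.getOrDefault("resourceIndex", 0);
        if (resourceIndex == resources.length) {
            return "FINISHED";
        }
        ctx.put("resourceName", resources[resourceIndex]);
        ctx.put("resourceIndex", resourceIndex + 1); // advance for the next pass
        return "CONTINUABLE";
    }

    public static void main(String[] args) {
        Map<String, Object> ctx = new HashMap<>();
        String[] files = {"file-1.txt", "file-2.txt"};
        System.out.println(decide(ctx, files)); // CONTINUABLE, resourceName=file-1.txt
        System.out.println(decide(ctx, files)); // CONTINUABLE, resourceName=file-2.txt
        System.out.println(decide(ctx, files)); // FINISHED
    }
}
```

Each readWriteStep execution then consumes the "resourceName" key, and the flow returns to the decision until every file has been processed.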
ResourceHolderFiller
is the job listener that injects the resolved resources into the ResourceHolderBean bean at every job start.
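A hypothetical sketch of that listener (the property names match the XML above; the holder bean is repeated so the snippet stands alone):

```java
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.core.io.Resource;

// Hypothetical listener: Spring resolves the "resources" pattern into a
// Resource[] when populating the property, and beforeJob() copies the array
// into the shared holder bean before each execution starts.
public class ResourceHolderFiller implements JobExecutionListener {

    private Resource[] resources;
    private ResourceHolderBean resourceHolderBean;

    public void setResources(Resource[] resources) {
        this.resources = resources;
    }

    public void setResourceHolderBean(ResourceHolderBean bean) {
        this.resourceHolderBean = bean;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        resourceHolderBean.setResources(resources);
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        // nothing to clean up
    }
}

// Repeated from above so this snippet is self-contained
class ResourceHolderBean {
    private Resource[] resources;
    public void setResources(Resource[] resources) { this.resources = resources; }
    public Resource[] getResources() { return resources; }
}
```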
Hope that can help.