spring-batch-tasklet

Spring Batch: multiple processes for heavy load, with multiple threads under every process

戏子无情 submitted on 2021-02-17 06:43:43
Question: I have a scenario where I need roughly 50-60 different processes running concurrently, each executing a task. Every process must fetch its data from the DB using a SQL query, passing a value and fetching the data to be run against in the subsequent task.

select col_1, col_2, col_3 from table_1 where col_1 = :Process_1;

@Bean
public Job partitioningJob() throws Exception {
    return jobBuilderFactory.get("parallelJob")
            .incrementer(new RunIdIncrementer())
            .flow(masterStep())
            .end()
            .build();
}
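A common way to meet this requirement is a partitioned step: one partition per process value, run in parallel by a TaskExecutor. The sketch below is one possible shape of masterStep(); the partitioner, the Row POJO, and the processValue key are assumptions, not taken from the question.

@Bean
public Step masterStep() {
    return stepBuilderFactory.get("masterStep")
            .partitioner("workerStep", processPartitioner())
            .step(workerStep())
            .taskExecutor(new SimpleAsyncTaskExecutor("worker-"))
            .gridSize(60) // roughly one partition per process
            .build();
}

@Bean
public Partitioner processPartitioner() {
    return gridSize -> {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 1; i <= gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            // each partition carries the value bound into the WHERE clause
            context.putString("processValue", "Process_" + i);
            partitions.put("partition" + i, context);
        }
        return partitions;
    };
}

@Bean
@StepScope
public JdbcCursorItemReader<Row> workerReader(
        @Value("#{stepExecutionContext['processValue']}") String processValue,
        DataSource dataSource) {
    // Row is a hypothetical POJO holding col_1..col_3
    return new JdbcCursorItemReaderBuilder<Row>()
            .name("workerReader")
            .dataSource(dataSource)
            .sql("select col_1, col_2, col_3 from table_1 where col_1 = ?")
            .preparedStatementSetter(ps -> ps.setString(1, processValue))
            .rowMapper((rs, rowNum) -> new Row(rs.getString(1), rs.getString(2), rs.getString(3)))
            .build();
}

If more threads are needed under every process, each worker step can additionally run its chunk-oriented tasklet on its own multi-threaded TaskExecutor.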

Spring Batch not processing all records

混江龙づ霸主 submitted on 2021-02-04 22:00:58
Question: I am using Spring Batch to read records from a PostgreSQL DB using a RepositoryItemReader and then write them to a topic. There were around 1 million records to be processed, but it didn't process all of them. I have set the reader's pageSize to 10,000, the same as the commit interval (chunk size).

@Bean
public TaskletStep broadcastProductsStep(){
    return stepBuilderFactory.get("broadcastProducts")
            .<Product, Product>chunk(10000)
            .reader(productsReader.repositoryItemReader())
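A frequent cause of a paging reader skipping rows is a missing or non-unique sort, or the query's result set changing while the job pages through it. Below is a minimal sketch of a RepositoryItemReader with an explicit, unique sort key; the repository and the findAll method name are assumptions:

@Bean
public RepositoryItemReader<Product> repositoryItemReader(ProductRepository repository) {
    Map<String, Sort.Direction> sorts = new HashMap<>();
    sorts.put("id", Sort.Direction.ASC); // unique, stable sort so pages do not overlap or skip
    return new RepositoryItemReaderBuilder<Product>()
            .name("productsReader")
            .repository(repository)
            .methodName("findAll")
            .pageSize(10000)
            .sorts(sorts)
            .build();
}

Note that if the writer updates the very columns the query filters or sorts on, the pages shift underneath the reader and records are silently skipped.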

How to call StepExecutionListener in Spring Batch with Kafka integration?

时光怂恿深爱的人放手 submitted on 2021-01-29 06:37:03
Question: Below is the config of the job in etl.xml:

<batch:job id="procuerJob">
    <batch:step id="Produce">
        <batch:partition partitioner="partitioner">
            <batch:handler grid-size="${partitioner.limit}"></batch:handler>
            <batch:step>
                <batch:tasklet>
                    <batch:chunk reader="Reader" writer="kafkaProducer" commit-interval="20000">
                    </batch:chunk>
                    <batch:listeners>
                        <batch:listener ref="producingListener" />
                    </batch:listeners>
                </batch:tasklet>
            </batch:step>
        </batch:partition>
    </batch:step>
</batch:job>

Below is the code
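For the listener to fire, the bean referenced by producingListener has to implement StepExecutionListener (or use the @BeforeStep/@AfterStep annotations) and be registered on the worker step as above. A minimal sketch of such a listener; the class name and method bodies are assumptions:

public class ProducingListener implements StepExecutionListener {

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // e.g. open the Kafka producer or record start metadata
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // e.g. flush/close the producer and adjust the exit status if needed
        return stepExecution.getExitStatus();
    }
}

Keep in mind that in a partitioned step a listener registered on the worker step is invoked once per partition, not once per job.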

Send data to Spring Batch Item Reader (or Tasklet)

a 夏天 submitted on 2020-06-23 14:24:25
Question: I have the following requirement: an endpoint http://localhost:8080/myapp/jobExecution/myJobName/execute which receives a CSV, uses univocity to apply some validations, and generates a List of some POJO. That list is then sent to a Spring Batch job for processing. Multiple users could do this. I want to know whether I can achieve this with Spring Batch. I was thinking of using a queue, putting the data on it, and executing a job that pulls objects from that queue. But how can I be sure that if another person executes the
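One possible approach, sketched under assumptions: rather than handing the parsed List to the job directly, persist the upload (here, a temp file) and pass only an identifier through JobParameters, so each user's execution stays isolated. The controller, job bean, and parameter keys below are hypothetical:

@RestController
public class JobExecutionController {

    private final JobLauncher jobLauncher;
    private final Job csvJob;

    public JobExecutionController(JobLauncher jobLauncher, Job csvJob) {
        this.jobLauncher = jobLauncher;
        this.csvJob = csvJob;
    }

    @PostMapping("/jobExecution/myJobName/execute")
    public String execute(@RequestParam("file") MultipartFile file) throws Exception {
        // store the upload and pass only its location as a job parameter
        Path tempFile = Files.createTempFile("upload-", ".csv");
        file.transferTo(tempFile);
        JobParameters params = new JobParametersBuilder()
                .addString("inputFile", tempFile.toString())
                .addLong("requestTime", System.currentTimeMillis()) // keeps each launch unique
                .toJobParameters();
        jobLauncher.run(csvJob, params);
        return "Job started";
    }
}

A @StepScope reader can then open the file named by #{jobParameters['inputFile']}, so concurrent users never share reader state.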

Spring Batch: Test case for Tasklet - key is not appearing in the actual class when it is invoked from the test class

半腔热情 submitted on 2020-05-17 05:59:22
Question: I am trying to learn Batch and Tasklet. I am writing a test case for a Tasklet in Spring Batch. I set a map in my test class, but when I debug, the actual class does not have the key which I am passing from my test class.

MyEventTasklet.java

public class MyEventTasklet implements Tasklet {
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        TreeMap<String, Map<Integer, Set<Student>>> studentMap = chunkContext.getStepContext().getJobExecutionContext()
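A likely explanation, sketched below: the tasklet reads from the job execution context, so the test has to put the key there (not into the step execution context) before building the ChunkContext. The test class name, the "studentMap" key, and the final assertion are assumptions based on the snippet; MetaDataInstanceFactory comes from spring-batch-test.

public class MyEventTaskletTest {

    @Test
    public void executePicksUpMapFromJobExecutionContext() throws Exception {
        StepExecution stepExecution = MetaDataInstanceFactory.createStepExecution();
        TreeMap<String, Map<Integer, Set<Student>>> studentMap = new TreeMap<>();
        // put the map under the same key the tasklet looks up, in the JOB execution context
        stepExecution.getJobExecution().getExecutionContext().put("studentMap", studentMap);

        ChunkContext chunkContext = new ChunkContext(new StepContext(stepExecution));
        StepContribution contribution = new StepContribution(stepExecution);

        RepeatStatus status = new MyEventTasklet().execute(contribution, chunkContext);

        // assumes the tasklet returns FINISHED on success
        assertEquals(RepeatStatus.FINISHED, status);
    }
}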

Why is my Spring Batch job not exiting after completion

眉间皱痕 submitted on 2020-03-25 18:07:52
Question: My batch job is configured as follows:

@Bean("MyJob")
public Job umpInpatientCensusRptBatchJob(...) throws IOException {
    return jobBuilderFactory.get("MyJob")
            .incrementer(new RunIdIncrementer())
            .start(Step0).on(COMPLETE).end()
            .from(Step0).on(CONTINUE)
            .to(Step1)
            .next(Step2)
            .next(Step3)
            .end()
            .build();
}

where Steps 0, 1, and 3 are tasklets. My job completes with the message Job: [FlowJob: [name=MyJob]] completed with the following parameters. However, it doesn't
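When a Spring Batch job finishes but the JVM keeps running, the usual culprit is a non-daemon thread pool (a TaskExecutor, a messaging client, etc.) that was never shut down. One hedged fix, assuming a Spring Boot launcher named MyBatchApplication, is to exit explicitly with the batch exit code, which closes the context and stops such pools:

public static void main(String[] args) {
    // SpringApplication.exit() closes the context (destroying executors) and maps
    // the job's ExitStatus to a process exit code
    System.exit(SpringApplication.exit(SpringApplication.run(MyBatchApplication.class, args)));
}

Alternatively, make sure any custom TaskExecutor used by the steps is destroyed when the context closes (or uses daemon threads), so its worker threads do not keep the JVM alive.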