Partitioned Job can't stop by itself after finishing? (Spring Batch)

無奈伤痛 2021-01-03 01:49

I wrote a Job of two Steps, with one of the two steps being a partitioning step. The partition step uses a TaskExecutorPartitionHandler and runs 5 slave steps in threads. The job finishes its work, but the application keeps running and never stops by itself.
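
For reference, a sketch of the kind of configuration described; every name below is illustrative rather than taken from the actual job:

    // Hypothetical partitioned step: a TaskExecutorPartitionHandler fans the
    // work out to 5 slave step executions on the given task executor.
    @Bean
    public Step partitionedStep() {
        TaskExecutorPartitionHandler handler = new TaskExecutorPartitionHandler();
        handler.setTaskExecutor(taskExecutor()); // e.g. a ThreadPoolTaskExecutor
        handler.setStep(slaveStep());            // hypothetical worker step
        handler.setGridSize(5);                  // 5 partitions -> 5 slave executions
        return stepBuilders.get("partitionedStep")
                .partitioner("slaveStep", partitioner()) // hypothetical Partitioner
                .partitionHandler(handler)
                .build();
    }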

3 Answers
  • 2021-01-03 02:14

    All of the above answers are hacks/workarounds. The root cause of the issue in the question is that the ThreadPoolTaskExecutor does not share the lifecycle of the step: when the step/job context is destroyed, the thread pool is not destroyed with it, so its threads keep running forever. Declaring the ThreadPoolTaskExecutor inside the step scope with @StepScope does the trick; Spring then takes care of destroying it.

    @Bean
    @StepScope
    public ThreadPoolTaskExecutor threadPoolTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5); // one thread per slave partition
        executor.setMaxPoolSize(5);
        return executor;
    }
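
    This works because ThreadPoolTaskExecutor implements DisposableBean: when the step scope is closed at the end of the step, Spring calls destroy(), which shuts the pool down. The pool sizes above are my own illustration; match them to your grid size.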

  • 2021-01-03 02:19

    There are two solutions to your problem, although I don't know the cause.

    First, you can use a CommandLineJobRunner to launch the Job (see the documentation here). This class automatically exits the program at the end of the job, converting the ExitStatus to a return code (COMPLETED = 0, FAILED = 1, ...). The default return codes are provided by a SimpleJvmExitCodeMapper.
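
    For illustration, a minimal sketch of such a launcher; the config class and job name here are assumptions, not something from your setup:

    // CommandLineJobRunner runs the named job, maps its ExitStatus to a
    // return code via SimpleJvmExitCodeMapper and calls System.exit() itself.
    import org.springframework.batch.core.launch.support.CommandLineJobRunner;

    public class Launcher {
        public static void main(String[] args) throws Exception {
            CommandLineJobRunner.main(new String[] {
                "com.example.BatchConfiguration", // hypothetical @Configuration class
                "partitionedJob"                  // job bean name
            });
        }
    }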

    The second solution is to call System.exit() manually after JobLauncher.run(). You can also map the ExitStatus of the Job to a return code yourself and use it in your manual exit:

    // Look up the JobLauncher and the Job in the application context
    JobLauncher jobLauncher = (JobLauncher) context.getBean("jobLauncher");
    Job job = (Job) context.getBean(jobName);

    // Mapper converting exit statuses to return codes (COMPLETED = 0, FAILED = 1, ...)
    SimpleJvmExitCodeMapper mapper = new SimpleJvmExitCodeMapper();

    // Run the job
    JobExecution execution = jobLauncher.run(job, new JobParameters());

    // Close the application context
    context.close();

    // Map the exit status to a return code and exit
    String status = execution.getExitStatus().getExitCode();
    Integer returnCode = mapper.intValue(status);
    System.exit(returnCode);
    
  • 2021-01-03 02:23

    I also had difficulty with my partitioned Spring Batch application hanging on completion when I used a ThreadPoolTaskExecutor. In addition, I saw that the executor was not letting the work of all the partitions finish.

    I found two ways of solving those issues.

    The first solution is to use a SimpleAsyncTaskExecutor instead of a ThreadPoolTaskExecutor, as sketched below. If you do not mind the overhead of re-creating threads, this is a simple fix.
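
    A minimal sketch of that fix; the thread name prefix and concurrency limit are my own assumptions:

    // SimpleAsyncTaskExecutor spawns a fresh thread per task and keeps no pool
    // alive afterwards, so nothing prevents the JVM from exiting when the job ends.
    @Bean
    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor executor = new SimpleAsyncTaskExecutor("partition-");
        executor.setConcurrencyLimit(5); // e.g. cap at the 5 slave steps
        return executor;
    }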

    The second solution is to create a JobExecutionListener that calls shutdown() on the ThreadPoolTaskExecutor.

    I created a JobExecutionListener like this:

    @Bean
    public JobExecutionListener jobExecutionListener(ThreadPoolTaskExecutor executor) {
        return new JobExecutionListener() {
            private ThreadPoolTaskExecutor taskExecutor = executor;

            @Override
            public void beforeJob(JobExecution jobExecution) {
                // nothing to do before the job starts
            }

            @Override
            public void afterJob(JobExecution jobExecution) {
                // shut the pool down once the job finishes, so its idle
                // threads no longer keep the JVM alive
                taskExecutor.shutdown();
            }
        };
    }
    

    and added it to my Job definition like this:

    @Bean
    public Job partitionedJob(){
        return jobBuilders.get("partitionedJob")
                .listener(jobExecutionListener(taskExecutor()))
                .start(partitionedStep())
                .build();
    }
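
    Note that this assumes taskExecutor() returns the same singleton ThreadPoolTaskExecutor instance that the partition handler uses; if a new executor were created per call, the listener would shut down a pool other than the one keeping the JVM alive.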
    