How to java-configure separate datasources for spring batch data and business data? Should I even do it?

Asked 2020-11-29 22:33

My main job does only read operations and the other one does some writing, but on the MyISAM engine, which ignores transactions, so I wouldn't necessarily require transaction support there.

7 Answers
  • 2020-11-29 22:50

    Have you tried something like this already?

    @Bean(name = "batchDataSource")
    public DataSource batchDataSource() {
        return DataSourceBuilder.create()
                .url(env.getProperty("batchdb.url"))
                .driverClassName(env.getProperty("batchdb.driver"))
                .username(env.getProperty("batchdb.username"))
                .password(env.getProperty("batchdb.password"))
                .build();
    }
    

    and then mark the other data source with @Primary, and use a @Qualifier in your batch config to specify that you want to autowire the batchDataSource bean.
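
    For example, a minimal sketch of that arrangement (the businessDataSource bean name and the businessdb.* property keys are placeholders, not from the original answer; env is the same injected Environment as above):

    @Bean(name = "businessDataSource")
    @Primary
    public DataSource businessDataSource() {
        // The @Primary data source is the one the rest of the application
        // (JPA, JdbcTemplate, ...) picks up by default.
        return DataSourceBuilder.create()
                .url(env.getProperty("businessdb.url"))
                .driverClassName(env.getProperty("businessdb.driver"))
                .username(env.getProperty("businessdb.username"))
                .password(env.getProperty("businessdb.password"))
                .build();
    }

    // In the batch configuration, select the non-primary bean explicitly:
    @Autowired
    @Qualifier("batchDataSource")
    private DataSource batchDataSource;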

  • 2020-11-29 22:55

    I have my data sources in a separate configuration class. In the batch configuration I extend DefaultBatchConfigurer and override the setDataSource method, passing in the specific data source Spring Batch should use via a @Qualifier. I was unable to get this to work with constructor injection, but the setter method worked for me.

    My Reader, Processor, and Writer are in their own self-contained classes, along with the steps (a rough sketch of one such step bean follows the configuration below).

    This uses Spring Boot 1.1.8 and Spring Batch 3.0.1. Note: we had a different setup for a project on Spring Boot 1.1.5 that did not behave the same way on the newer version.

    package org.sample.config.jdbc;
    
    import javax.sql.DataSource;
    
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Primary;
    import org.springframework.core.env.Environment;
    
    import com.atomikos.jdbc.AtomikosDataSourceBean;
    import com.mysql.jdbc.jdbc2.optional.MysqlXADataSource;
    
    /**
     * The Class DataSourceConfiguration.
     *
     */
    @Configuration
    public class DataSourceConfig {
    
        private final static Logger log = LoggerFactory.getLogger(DataSourceConfig.class);
    
        @Autowired private Environment env;
    
        /**
         * Main (business) data source.
         *
         * @return the data source
         */
        @Bean(name = "mainDataSource")
        @Primary
        public DataSource mainDataSource() {
    
            final String user = this.env.getProperty("db.main.username");
            final String password = this.env.getProperty("db.main.password");
            final String url = this.env.getProperty("db.main.url");
    
            return this.getMysqlXADataSource(url, user, password);
        }
    
        /**
         * Batch data source.
         *
         * @return the data source
         */
        @Bean(name = "batchDataSource", initMethod = "init", destroyMethod = "close")
        public DataSource batchDataSource() {
    
            final String user = this.env.getProperty("db.batch.username");
            final String password = this.env.getProperty("db.batch.password");
            final String url = this.env.getProperty("db.batch.url");
    
            return this.getAtomikosDataSource("metaDataSource", this.getMysqlXADataSource(url, user, password));
        }
    
        /**
         * Gets the mysql xa data source.
         *
         * @param url the url
         * @param user the user
         * @param password the password
         * @return the mysql xa data source
         */
        private MysqlXADataSource getMysqlXADataSource(final String url, final String user, final String password) {
    
            final MysqlXADataSource mysql = new MysqlXADataSource();
            mysql.setUser(user);
            mysql.setPassword(password);
            mysql.setUrl(url);
            mysql.setPinGlobalTxToPhysicalConnection(true);
    
            return mysql;
        }
    
        /**
         * Gets the atomikos data source.
         *
         * @param resourceName the resource name
         * @param xaDataSource the xa data source
         * @return the atomikos data source
         */
        private AtomikosDataSourceBean getAtomikosDataSource(final String resourceName, final MysqlXADataSource xaDataSource) {
    
            final AtomikosDataSourceBean atomikos = new AtomikosDataSourceBean();
            atomikos.setUniqueResourceName(resourceName);
            atomikos.setXaDataSource(xaDataSource);
            atomikos.setMaxLifetime(3600);
            atomikos.setMinPoolSize(2);
            atomikos.setMaxPoolSize(10);
    
            return atomikos;
        }
    
    }
    
    
    package org.sample.settlement.batch;
    
    import javax.sql.DataSource;
    
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.launch.support.RunIdIncrementer;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.transaction.PlatformTransactionManager;
    
    /**
     * The Class BatchConfiguration.
     *
     */
    @Configuration
    @EnableBatchProcessing
    public class BatchConfiguration extends DefaultBatchConfigurer {
        private final static Logger log = LoggerFactory.getLogger(BatchConfiguration.class);
        @Autowired private JobBuilderFactory jobs;
        @Autowired private StepBuilderFactory steps;
        @Autowired private PlatformTransactionManager transactionManager;
        @Autowired @Qualifier("processStep") private Step processStep;
    
        /**
         * Process payments job.
         *
         * @return the job
         */
        @Bean(name = "processJob")
        public Job processJob() {
            return this.jobs.get("processJob")
                        .incrementer(new RunIdIncrementer())
                        .start(processStep)
                        .build();
        }
    
        @Override
        @Autowired
        public void setDataSource(@Qualifier("batchDataSource") DataSource batchDataSource) {
            super.setDataSource(batchDataSource);
        }
    }
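
    The processStep bean autowired above lives in its own self-contained class in this setup. A rough, hypothetical sketch of what such a step definition could look like (the Payment type and the reader/processor/writer beans are placeholders, not part of the original answer):

    @Configuration
    public class ProcessStepConfiguration {

        @Autowired
        private StepBuilderFactory steps;

        @Bean(name = "processStep")
        public Step processStep(final ItemReader<Payment> reader,
                final ItemProcessor<Payment, Payment> processor,
                final ItemWriter<Payment> writer) {
            // Chunk-oriented step; the reader, processor and writer are defined
            // elsewhere in their own classes and injected here.
            return this.steps.get("processStep")
                    .<Payment, Payment>chunk(100)
                    .reader(reader)
                    .processor(processor)
                    .writer(writer)
                    .build();
        }
    }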
    
  • 2020-11-29 22:57

    If your Spring Boot version is 2.2.0 or later, annotate the batch data source with @BatchDataSource.

    The annotation is defined as follows:

    /**
     * Qualifier annotation for a DataSource to be injected into Batch auto-configuration. Can
     * be used on a secondary data source, if there is another one marked as
     * {@link Primary @Primary}.
     *
     * @author Dmytro Nosan
     * @since 2.2.0
     */
    @Target({ ElementType.FIELD, ElementType.METHOD, ElementType.PARAMETER, ElementType.TYPE, ElementType.ANNOTATION_TYPE })
    @Retention(RetentionPolicy.RUNTIME)
    @Documented
    @Qualifier
    public @interface BatchDataSource {
    
    }
    

    for example:

    @BatchDataSource
    @Bean("batchDataSource")
    public DataSource batchDataSource(@Qualifier("batchDataSourceProperties") DataSourceProperties dataSourceProperties) {
        return dataSourceProperties
                .initializeDataSourceBuilder()
                .type(HikariDataSource.class)
                .build();
    }
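
    The example above assumes a separately defined batchDataSourceProperties bean plus a @Primary data source for the business data. A rough sketch of those companion beans (the app.datasource.* property prefixes and bean names are placeholders):

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.main")
    public DataSourceProperties mainDataSourceProperties() {
        // Connection settings for the business data source.
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    public DataSource mainDataSource(@Qualifier("mainDataSourceProperties") DataSourceProperties properties) {
        // Business data source; @Primary so JPA/JdbcTemplate use it by default.
        return properties.initializeDataSourceBuilder().build();
    }

    @Bean("batchDataSourceProperties")
    @ConfigurationProperties("app.datasource.batch")
    public DataSourceProperties batchDataSourceProperties() {
        // Connection settings for the Spring Batch meta-data tables.
        return new DataSourceProperties();
    }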
    
  • 2020-11-29 23:00

    As suggested by Frozen in his answer, two DataSources did the trick for me. Additionally, I needed to define a BatchDataSourceInitializer to properly initialize the batch DataSource, as suggested in Michael Minella's answer to this related question.

    DataSource configuration

    @Configuration
    public class DataSourceConfiguration {
    
        @Bean
        @Primary
        @ConfigurationProperties("domain.datasource")
        public DataSource domainDataSource() {
            return DataSourceBuilder.create().build();
        }
    
        @Bean("batchDataSource")
        @ConfigurationProperties("batch.datasource")
        public DataSource batchDataSource() {
            return DataSourceBuilder.create().build();
        }
    }
    

    Batch Configuration

    @Configuration
    @EnableBatchProcessing
    public class BatchConfiguration extends DefaultBatchConfigurer {
    
        @Override
        @Autowired
        public void setDataSource(@Qualifier("batchDataSource") DataSource batchDataSource) {
            super.setDataSource(batchDataSource);
        }
    
        @Bean
        public BatchDataSourceInitializer batchDataSourceInitializer(@Qualifier("batchDataSource") DataSource batchDataSource,
                ResourceLoader resourceLoader) {
            return new BatchDataSourceInitializer(batchDataSource, resourceLoader, new BatchProperties());
        }
    }

    application.properties:

    # Sample configuration using an H2 in-memory DB
    domain.datasource.jdbcUrl=jdbc:h2:mem:domain-ds;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    domain.datasource.username=sa
    domain.datasource.password=
    domain.datasource.driver-class-name=org.h2.Driver

    batch.datasource.jdbcUrl=jdbc:h2:mem:batch-ds;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    batch.datasource.username=sa
    batch.datasource.password=
    batch.datasource.driver-class-name=org.h2.Driver
    
  • 2020-11-29 23:02

    Ok, this is strange, but it works. Moving the data sources to their own configuration class works just fine, and one is able to autowire.

    The example is a multi-datasource version of the Spring Batch Service Example (the Person and PersonItemProcessor classes it refers to are sketched after the configuration below):

    DataSourceConfiguration:

    @Configuration
    public class DataSourceConfiguration {
    
        @Value("classpath:schema-mysql.sql")
        private Resource schemaScript;
    
        @Bean
        @Primary
        public DataSource hsqldbDataSource() throws SQLException {
            final SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
            dataSource.setDriver(new org.hsqldb.jdbcDriver());
            dataSource.setUrl("jdbc:hsqldb:mem:mydb");
            dataSource.setUsername("sa");
            dataSource.setPassword("");
            return dataSource;
        }
    
        @Bean
        public JdbcTemplate jdbcTemplate(final DataSource dataSource) {
            return new JdbcTemplate(dataSource);
        }
    
        @Bean
        public DataSource mysqlDataSource() throws SQLException {
            final SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
            dataSource.setDriver(new com.mysql.jdbc.Driver());
            dataSource.setUrl("jdbc:mysql://localhost/spring_batch_example");
            dataSource.setUsername("test");
            dataSource.setPassword("test");
            DatabasePopulatorUtils.execute(databasePopulator(), dataSource);
            return dataSource;
        }
    
        @Bean
        public JdbcTemplate mysqlJdbcTemplate(@Qualifier("mysqlDataSource") final DataSource dataSource) {
            return new JdbcTemplate(dataSource);
        }
    
        private DatabasePopulator databasePopulator() {
            final ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
            populator.addScript(schemaScript);
            return populator;
        }
    }
    

    BatchConfiguration:

    @Configuration
    @EnableBatchProcessing
    @Import({ DataSourceConfiguration.class, MBeanExporterConfig.class })
    public class BatchConfiguration {
    
        @Autowired
        private JobBuilderFactory jobs;
    
        @Autowired
        private StepBuilderFactory steps;
    
        @Bean
        public ItemReader<Person> reader() {
            final FlatFileItemReader<Person> reader = new FlatFileItemReader<Person>();
            reader.setResource(new ClassPathResource("sample-data.csv"));
            reader.setLineMapper(new DefaultLineMapper<Person>() {
                {
                    setLineTokenizer(new DelimitedLineTokenizer() {
                        {
                            setNames(new String[] { "firstName", "lastName" });
                        }
                    });
                    setFieldSetMapper(new BeanWrapperFieldSetMapper<Person>() {
                        {
                            setTargetType(Person.class);
                        }
                    });
                }
            });
            return reader;
        }
    
        @Bean
        public ItemProcessor<Person, Person> processor() {
            return new PersonItemProcessor();
        }
    
        @Bean
        public ItemWriter<Person> writer(@Qualifier("mysqlDataSource") final DataSource dataSource) {
            final JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
            writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
            writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
            writer.setDataSource(dataSource);
            return writer;
        }
    
        @Bean
        public Job importUserJob(final Step s1) {
            return jobs.get("importUserJob").incrementer(new RunIdIncrementer()).flow(s1).end().build();
        }
    
        @Bean
        public Step step1(final ItemReader<Person> reader,
                final ItemWriter<Person> writer, final ItemProcessor<Person, Person> processor) {
            return steps.get("step1")
                    .<Person, Person> chunk(1)
                    .reader(reader)
                    .processor(processor)
                    .writer(writer)
                    .build();
        }
    }
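
    For completeness, a bare-bones sketch of the Person and PersonItemProcessor classes the configuration refers to (the field names simply mirror the CSV columns and the INSERT statement above; the original classes may differ):

    public class Person {

        private String firstName;
        private String lastName;

        public String getFirstName() { return firstName; }
        public void setFirstName(final String firstName) { this.firstName = firstName; }

        public String getLastName() { return lastName; }
        public void setLastName(final String lastName) { this.lastName = lastName; }
    }

    public class PersonItemProcessor implements ItemProcessor<Person, Person> {

        @Override
        public Person process(final Person person) throws Exception {
            // Example transformation: upper-case the names before they are written.
            final Person transformed = new Person();
            transformed.setFirstName(person.getFirstName().toUpperCase());
            transformed.setLastName(person.getLastName().toUpperCase());
            return transformed;
        }
    }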
    
  • 2020-11-29 23:13

    Per https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#howto-two-datasources:

    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.first")
    public DataSourceProperties firstDataSourceProperties() {
        return new DataSourceProperties();
    }
    
    @Bean
    @Primary
    @ConfigurationProperties("app.datasource.first")
    public DataSource firstDataSource() {
        return firstDataSourceProperties().initializeDataSourceBuilder().build();
    }
    
    @Bean
    @ConfigurationProperties("app.datasource.second")
    public DataSourceProperties secondDataSourceProperties() {
        return new DataSourceProperties();
    }
    
    @Bean
    @ConfigurationProperties("app.datasource.second")
    public DataSource secondDataSource() {
        return secondDataSourceProperties().initializeDataSourceBuilder().build();
    }
    

    In the application properties, you can use regular datasource properties:

    app.datasource.first.type=com.zaxxer.hikari.HikariDataSource
    app.datasource.first.maximum-pool-size=30
    
    app.datasource.second.url=jdbc:mysql://localhost/test
    app.datasource.second.username=dbuser
    app.datasource.second.password=dbpass
    app.datasource.second.max-total=30
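
    To connect this back to Spring Batch, the non-primary data source can then be handed to the batch infrastructure in the same way as in the earlier answers. A sketch, assuming the bean names above and a DefaultBatchConfigurer-based setup:

    @Configuration
    @EnableBatchProcessing
    public class BatchConfiguration extends DefaultBatchConfigurer {

        @Override
        @Autowired
        public void setDataSource(@Qualifier("secondDataSource") DataSource dataSource) {
            // Point the Spring Batch job repository at the secondary data source.
            super.setDataSource(dataSource);
        }
    }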
    