Spring Boot connection pool autoconfiguration for more than one datasource

Question


I have a web application written with Spring Boot, plus a Spring Batch job that reads from a CSV file and writes the data to an RDS table (MySQL engine) using Spring's JdbcTemplate. I have moved the Spring Batch job metadata tables into a separate database to keep them apart from my business tables. I need connection pooling for both databases, and I am using the default pool that Spring Boot provides (the Tomcat JDBC pool). I configure a JdbcTemplate for each database in my configuration file, as shown below. I have to put the @Primary annotation on the job metadata DataSource, because otherwise the Spring Batch job fails to identify its database. Because of this, when I checked JConsole I found that only one connection pool had been created, for the @Primary DataSource. How can I enable connection pooling for both DataSources in this case?

My data source configuration is:

    @Primary
    @Bean(name = "mysqlDs")
    @ConfigurationProperties(prefix = "datasource.sql.jobMetaDataDb")
    public DataSource sqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mysql")
    public JdbcTemplate slaveJdbcTemplate(@Qualifier("mysqlDs") DataSource mysqlDs) {
        return new JdbcTemplate(mysqlDs);
    }

    @Bean(name = "rdsDataSource")
    @ConfigurationProperties(prefix = "datasource.sql.rdsWriterDb")
    public DataSource rdsDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "rdsJdbcTemplate")
    public JdbcTemplate rdsJdbcTemplate(@Qualifier("rdsDataSource") DataSource rdsDataSource) {
        return new JdbcTemplate(rdsDataSource);
    }
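
One variation I have been considering (only a sketch on my side, not something I have verified to change the JConsole behaviour): pinning both builders to the Tomcat pool type explicitly, so that the pool settings in the properties below are guaranteed to bind onto org.apache.tomcat.jdbc.pool.DataSource regardless of what else is on the classpath:

    @Primary
    @Bean(name = "mysqlDs")
    @ConfigurationProperties(prefix = "datasource.sql.jobMetaDataDb")
    public DataSource sqlDataSource() {
        // Force the Tomcat JDBC pool so that testWhileIdle, validationQuery etc.
        // bind onto the pool implementation rather than being silently dropped.
        return DataSourceBuilder.create()
                .type(org.apache.tomcat.jdbc.pool.DataSource.class)
                .build();
    }

    @Bean(name = "rdsDataSource")
    @ConfigurationProperties(prefix = "datasource.sql.rdsWriterDb")
    public DataSource rdsDataSource() {
        return DataSourceBuilder.create()
                .type(org.apache.tomcat.jdbc.pool.DataSource.class)
                .build();
    }

On the @Primary point: as far as I understand, Spring Batch can also be handed its metadata DataSource explicitly through a BatchConfigurer, roughly like this (again a sketch, not my current code):

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("mysqlDs") DataSource metaDataSource) {
        // DefaultBatchConfigurer builds the JobRepository on the given DataSource,
        // so the metadata database would not need to be the @Primary bean.
        return new DefaultBatchConfigurer(metaDataSource);
    }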

The database-specific settings in my properties file are:

    #MySQL endpoint configuration for the
    #Spring Batch job metadata database.

    datasource.sql.jobMetaDataDb.url=jdbc:mysql://localhost/jobMetadata?characterEncoding=UTF-8
    datasource.sql.jobMetaDataDb.username=root
    datasource.sql.jobMetaDataDb.password=root
    datasource.sql.jobMetaDataDb.driverClassName=com.mysql.jdbc.Driver
    datasource.sql.jobMetaDataDb.jmx-enabled=true
    #Configuration to avoid wait_timeout
    datasource.sql.jobMetaDataDb.testWhileIdle=true
    datasource.sql.jobMetaDataDb.timeBetweenEvictionRunsMillis=7200000
    datasource.sql.jobMetaDataDb.validationQuery=SELECT 1

    #Database configuration for RdsWriter.
    datasource.sql.rdsWriterDb.url=jdbc:mysql://localhost/rds?characterEncoding=UTF-8
    datasource.sql.rdsWriterDb.username=root
    datasource.sql.rdsWriterDb.password=root
    datasource.sql.rdsWriterDb.driverClassName=com.mysql.jdbc.Driver
    #Configuration to avoid wait_timeout
    datasource.sql.rdsWriterDb.testWhileIdle=true
    datasource.sql.rdsWriterDb.timeBetweenEvictionRunsMillis=7200000
    datasource.sql.rdsWriterDb.validationQuery=SELECT 1
    datasource.sql.rdsWriterDb.jmx-enabled=true
    spring.datasource.jmx-enabled=true
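
For reference, my understanding is that pool sizing can be bound per datasource in the same way, because @ConfigurationProperties binds onto the Tomcat pool's setters. The property names below mirror those setters (initialSize, maxActive, maxIdle, minIdle) and the values are only illustrative, not part of my actual configuration:

    #Illustrative per-datasource pool sizing (values are placeholders)
    datasource.sql.jobMetaDataDb.initialSize=5
    datasource.sql.jobMetaDataDb.maxActive=20
    datasource.sql.jobMetaDataDb.maxIdle=10
    datasource.sql.jobMetaDataDb.minIdle=5
    datasource.sql.rdsWriterDb.initialSize=5
    datasource.sql.rdsWriterDb.maxActive=20
    datasource.sql.rdsWriterDb.maxIdle=10
    datasource.sql.rdsWriterDb.minIdle=5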

I write data to RDS from a Spring Batch writer class, shown below, with the JdbcTemplate autowired in.

package com.fastretailing.catalogPlatformSCMProducer.producerjob.writer.rds;

import com.fastretailing.catalogPlatformSCMProducer.constants.ProducerJobConstants;
import com.fastretailing.catalogPlatformSCMProducer.model.Configuration;
import com.fastretailing.catalogPlatformSCMProducer.model.ProducerMessage;
import com.fastretailing.catalogPlatformSCMProducer.model.RDSColumnInfo;
import com.fastretailing.catalogPlatformSCMProducer.notification.JobStatus;
import com.fastretailing.catalogPlatformSCMProducer.util.ProducerUtil;
import com.fastretailing.catalogPlatformSCMProducer.util.QueryGenerator;
import org.apache.commons.lang3.exception.ExceptionUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.item.support.AbstractItemStreamItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.dao.EmptyResultDataAccessException;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.transaction.annotation.Transactional;

import java.sql.BatchUpdateException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

/**
 * Producer Job writer for writing directly to RDS.
 */
public class RdsWriter extends AbstractItemStreamItemWriter<ProducerMessage> {

    @Autowired
    @Qualifier("rdsJdbcTemplate")
    JdbcTemplate rdsJdbcTemplate;

    @Autowired
    Configuration configuration;

    QueryGenerator queryGenerator;
    private final Logger LOGGER = LoggerFactory.getLogger(this.getClass());

    @Override
    public void write(List<? extends ProducerMessage> list) throws Exception {
        handleRecord(list);
    }

    @Transactional
    public void handleRecord(List<? extends ProducerMessage> list) {
        List<Object[]> objectList_insert = new ArrayList<>();
        List<Object> insertObject = null;
        String tableName = null;

        for (ProducerMessage message : list) {
            try {
                insertObject = new ArrayList<>();
                // Create the query generator once per job.
                if (null == queryGenerator) {
                    queryGenerator = new QueryGenerator(message);
                }
                if (null == tableName) {
                    tableName = message.getTableName();
                }
                String timestampValidationPS = ProducerUtil.generateTimestampCheckPS(message);
                Long returnValue;
                try {
                    returnValue = rdsJdbcTemplate.queryForObject(timestampValidationPS,
                            ProducerUtil.generatePrimaryKeyObjectList(message), Long.class);
                } catch (EmptyResultDataAccessException e) {
                    LOGGER.debug("Primary key does not exist in the RDS table. A new row will be inserted.");
                    returnValue = null;
                }

                if (null == returnValue || returnValue <= message.getTimeStamp()) {
                    for (RDSColumnInfo columnInfo : message.getRecord()) {
                        // Insert null when the CSV value is empty and the column is not varchar.
                        if (columnInfo.getRdsColumnValue().isEmpty()
                                && !columnInfo.getRdsVarType().equalsIgnoreCase(ProducerJobConstants.TYPE_VARCHAR)) {
                            insertObject.add(null);
                        } else {
                            insertObject.add(columnInfo.getRdsColumnValue());
                        }
                    }
                    objectList_insert.add(insertObject.toArray());
                } else {
                    JobStatus.addRowsSkippedWriting(1);
                    LOGGER.debug("Skipped row due to timestamp check failure for feedName {}",
                            message.getFeedConfigName());
                }
            } catch (Exception e) {
                JobStatus.changeStatus(ProducerUtil.SNS_NOTIFICATION_EVENT_IN_COMPLETE);
                JobStatus.addExceptionInLogWriter(ExceptionUtils.getStackTrace(e));
                JobStatus.addRowsSkippedWriting(1);
                LOGGER.error("Exception while processing records for the RDS write. These records will be skipped.",
                        e);
            }
        }

        try {
            if (!objectList_insert.isEmpty()) {
                String insertQuery = queryGenerator.generateRdsInsertPS(tableName);
                LOGGER.debug("Executing query {}", insertQuery);
                rdsJdbcTemplate.batchUpdate(insertQuery, objectList_insert);
                JobStatus.addRowsWritten(objectList_insert.size());
            }
        } catch (Exception e) {
            // Recover per-row written/skipped counts if a BatchUpdateException is in the cause chain.
            int batchExceptionIndex = ExceptionUtils.indexOfThrowable(e, BatchUpdateException.class);
            if (batchExceptionIndex != -1) {
                BatchUpdateException be =
                        (BatchUpdateException) ExceptionUtils.getThrowables(e)[batchExceptionIndex];
                handleUpdateCountOnException(be);
            }
            JobStatus.changeStatus(ProducerUtil.SNS_NOTIFICATION_EVENT_IN_COMPLETE);
            JobStatus.addExceptionInLogWriter(ExceptionUtils.getStackTrace(e));
            LOGGER.error("Exception while writing records to the RDS table. These records will be skipped.",
                    e);
        }
    }

    private void handleUpdateCountOnException(BatchUpdateException be) {
        for (int count : be.getUpdateCounts()) {
            if (count == Statement.EXECUTE_FAILED) {
                JobStatus.addRowsSkippedWriting(1);
            } else {
                JobStatus.addRowsWritten(1);
            }
        }
    }
}
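
To double-check what JConsole is telling me, I have been thinking about logging the pool statistics for both beans at startup, roughly like the sketch below. It would go in the same configuration class as the DataSource beans, and it assumes both beans really are org.apache.tomcat.jdbc.pool.DataSource instances; the bean name poolStatsLogger is mine, not from the application:

    @Bean
    public CommandLineRunner poolStatsLogger(@Qualifier("mysqlDs") DataSource mysqlDs,
                                             @Qualifier("rdsDataSource") DataSource rdsDs) {
        return args -> {
            for (DataSource ds : new DataSource[] { mysqlDs, rdsDs }) {
                if (ds instanceof org.apache.tomcat.jdbc.pool.DataSource) {
                    org.apache.tomcat.jdbc.pool.DataSource tomcatDs =
                            (org.apache.tomcat.jdbc.pool.DataSource) ds;
                    // getPoolName(), getNumActive() and getNumIdle() are part of the
                    // Tomcat pool API; two distinct pool names would mean two pools exist.
                    System.out.printf("pool=%s active=%d idle=%d%n",
                            tomcatDs.getPoolName(), tomcatDs.getNumActive(), tomcatDs.getNumIdle());
                }
            }
        };
    }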

Source: https://stackoverflow.com/questions/37163160/spring-boot-connection-pool-autoconfiguration-for-more-than-one-datasource
