Question
I'm working on a Spring Batch and SCDF example. In this example I am reading a CSV file and loading all the data into MySQL. I was able to successfully load the data into the MySQL DB, but the UI doesn't show me the START_TIME and END_TIME, and the DB doesn't hold any such records either.
I've uploaded my code here: https://github.com/JavaNeed/spring-cloud-dataflow-example1.git
spring-cloud-data-flow-server
SpringCloudDataFlowServerApplication.java
@EnableDataFlowServer
@SpringBootApplication(exclude = { CloudFoundryDeployerAutoConfiguration.class})
public class SpringCloudDataFlowServerApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringCloudDataFlowServerApplication.class, args);
    }
}
application.properties
spring.datasource.url=jdbc:mysql://localhost:3306/test
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.cloud.dataflow.features.streams-enabled=false
#spring.cloud.dataflow.rdbms.initialize.enable=false
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath /> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.example</groupId>
    <artifactId>spring-cloud-data-flow-server</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>spring-cloud-data-flow-server</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <maven-jar-plugin.version>3.1.1</maven-jar-plugin.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <java.version>1.8</java.version>
        <spring-cloud-starter-dataflow-server.version>2.4.2.RELEASE</spring-cloud-starter-dataflow-server.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-dataflow-server</artifactId>
            <version>${spring-cloud-starter-dataflow-server.version}</version>
        </dependency>
        <dependency>
            <groupId>org.flywaydb</groupId>
            <artifactId>flyway-core</artifactId>
            <version>6.4.1</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
JobConfig.java
@Configuration
public class JobConfig {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private DataSource dataSource;

    @Bean
    public FlatFileItemReader<Customer> customerItemReader() {
        FlatFileItemReader<Customer> reader = new FlatFileItemReader<>();
        reader.setLinesToSkip(1);
        reader.setResource(new ClassPathResource("/data/customer.csv"));

        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames(new String[] { "id", "firstName", "lastName", "birthdate" });

        DefaultLineMapper<Customer> customerLineMapper = new DefaultLineMapper<>();
        customerLineMapper.setLineTokenizer(tokenizer);
        customerLineMapper.setFieldSetMapper(new CustomerFieldSetMapper());
        customerLineMapper.afterPropertiesSet();

        reader.setLineMapper(customerLineMapper);
        return reader;
    }

    @Bean
    public JdbcBatchItemWriter<Customer> customerItemWriter() {
        JdbcBatchItemWriter<Customer> writer = new JdbcBatchItemWriter<>();
        writer.setDataSource(this.dataSource);
        writer.setSql("INSERT INTO CUSTOMER VALUES (:id, :firstName, :lastName, :birthdate)");
        writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>());
        writer.afterPropertiesSet();
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Customer, Customer>chunk(10)
                .reader(customerItemReader())
                .writer(customerItemWriter())
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get("job")
                .incrementer(new RunIdIncrementer())
                .start(step1())
                .build();
    }
}
application.properties
# MYSQL DB Details
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/test
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.schema=org/springframework/batch/core/schema-mysql.sql
spring.batch.initialize-schema=ALWAYS
#spring.batch.schema=classpath:schema-mysql.sql
Answer 1:
From a brief review of the databaseOutput application, it appears you aren't using Spring Cloud Task.
Today, SCDF recognizes, resolves, and automates the orchestration of Spring Cloud Stream and Spring Cloud Task applications natively. While other workloads (including polyglot) are possible, only these two types of applications are treated with more care and automation.
In your case, in particular, even if your batch job successfully did its thing, the transaction boundary is only governed if the batch-job application is wrapped as a Spring Cloud Task application, which is why you don't see the executions recorded in SCDF's database. Likewise, the UI/Shell/API won't show such details either.
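To make that concrete, here is a minimal sketch of what the batch-job's main class could look like when wrapped as a Spring Cloud Task application. It assumes the spring-cloud-starter-task dependency is on the classpath; the class name is a placeholder, not taken from the linked repository:

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// Hypothetical main class for the batch-job module. @EnableTask records each run in the
// Spring Cloud Task TASK_EXECUTION tables, which is what SCDF reads to display the
// START_TIME and END_TIME of a launch.
@EnableTask
@EnableBatchProcessing
@SpringBootApplication
public class BatchJobTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchJobTaskApplication.class, args);
    }
}

With @EnableTask in place, the task application's datasource should point at the same MySQL database that SCDF itself uses, so the task and batch execution metadata it writes becomes visible in the dashboard.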
I would recommend reviewing the batch developer guide in its entirety. Perhaps repeat the guide as-is first to get a feel for how things come together. Once you see them in action (esp. the spring-cloud-task boundaries), you will be able to retrofit your app to comply with the foundational-level expectations.
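For reference, once the job is rebuilt as a task application, the usual flow in the SCDF shell looks roughly like the following (the app name and Maven coordinates are placeholders for your own artifact):

app register --name customer-load --type task --uri maven://com.example:batch-job:0.0.1-SNAPSHOT
task create customer-load-task --definition "customer-load"
task launch customer-load-task

After a launch, the task and job executions, including their start and end times, should show up under the Tasks and Jobs views in the dashboard.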
Source: https://stackoverflow.com/questions/61566124/spring-cloud-dataflow-and-mysql-doesnt-show-the-start-time-and-end-time-of-the