spring-cloud-dataflow

Load balancing the Spring Cloud Data Flow server

Submitted by 不打扰是莪最后的温柔 on 2020-01-17 06:52:34
Question: In Spring Cloud Data Flow, as I understand it, each stream is a microservice, but the Data Flow server itself is not. Am I right? Is it possible to run multiple instances of the Spring Cloud Data Flow (SCDF) server, and how would the server be load balanced? I am planning to deploy it on AWS. The official documentation doesn't mention anything about load balancing the Data Flow server. If it is possible, how do the Dashboard and the shell work against it?

Answer 1: The SCDF server is a regular Spring MVC + Spring Boot application that
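If the server is fronted by a load balancer, the shell and the Dashboard only need the balanced URL. A minimal sketch, assuming a hypothetical ELB hostname scdf-elb.example.com and the default server port 9393, of pointing the SCDF shell at the balancer:

    dataflow config server --uri http://scdf-elb.example.com:9393

The Dashboard is served from the same endpoint (under /dashboard), so it is reachable through the balancer as well. Note that running multiple server instances would also require them to share one database, so that every instance sees the same stream and task definitions.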

Spring Cloud Data Flow custom application properties

Submitted by 社会主义新天地 on 2020-01-10 06:10:02
Question: I created a custom Spring Cloud Data Flow application. I would like to create a stream with it and set some application properties on it, the way they can be added for the provided log application (0/3 properties). I tried an application.yml file in the resources folder:

    spring:
      application:
        toto: 'titi'

but it didn't work. I also tried to create a Properties class:

    public class Properties {
        // public static final String PREFIX = "portin";
        private String toto;

        public Properties(String toto) {
            this.toto =
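For reference, the usual way to expose custom properties to SCDF is a @ConfigurationProperties class rather than free-form application.yml keys. A minimal sketch, assuming the hypothetical prefix portin from the commented-out constant above:

    import org.springframework.boot.context.properties.ConfigurationProperties;

    @ConfigurationProperties(prefix = "portin")
    public class PortinProperties {

        /** This Javadoc becomes the description tooling can show for portin.toto. */
        private String toto = "titi";

        public String getToto() {
            return toto;
        }

        public void setToto(String toto) {
            this.toto = toto;
        }
    }

The class still has to be enabled, for example with @EnableConfigurationProperties(PortinProperties.class) on a configuration class, and the spring-boot-configuration-processor dependency generates the metadata that lets the Dashboard list the property. Older SCDF releases additionally expect the class to be declared in META-INF/spring-configuration-metadata-whitelist.properties.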

Data Flow tasks are not working with Spring Batch

Submitted by 前提是你 on 2020-01-06 04:42:16
Question: I have a Spring Batch job that is also a Data Flow task. When I run the job everything seems OK: in Tasks > Executions I can see that the task finished successfully. On the other hand, when I go to the Jobs tab I get this error (on the command line):

    java.lang.NullPointerException: null
        at org.springframework.cloud.dataflow.server.service.impl.DefaultTaskJobService.getTaskJobExecution(DefaultTaskJobService.java:240) ~[spring-cloud-dataflow-server-core-1.2.2.RELEASE.jar!/:1.2.2.RELEASE]
        at org
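For the Jobs tab to resolve a batch run, the application generally needs to be both a Spring Cloud Task and a Spring Batch job writing to the database the server reads. A minimal sketch of the annotation combination, assuming the spring-cloud-starter-task dependency (class name hypothetical):

    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.task.configuration.EnableTask;

    @SpringBootApplication
    @EnableTask
    @EnableBatchProcessing
    public class BatchTaskApplication {

        public static void main(String[] args) {
            SpringApplication.run(BatchTaskApplication.class, args);
        }
    }

If the job's batch metadata lands in an embedded H2 instance instead of the server's datasource, the server can find a task execution with no matching BATCH_JOB_EXECUTION rows, which may surface as a NullPointerException like the one above.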

Create Stream with one source, two parallel processors and one sink in Spring Cloud Data Flow

Submitted by 别说谁变了你拦得住时间么 on 2020-01-04 02:33:35
Question: I am trying to create a stream in Spring Cloud Data Flow with:

- One source, order-source: an Order message is published to the RabbitMQ topic/queue.
- Two parallel processors, product-processor and shipment-processor: both subscribe to the RabbitMQ topic/queue, receive the Order message, process it individually, update the Order, and publish the updated Order message to the RabbitMQ topic/queue.
- One sink, i.e.
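One way to fan the same message out to two processors is named destinations, with one stream per branch. A sketch in SCDF stream DSL, assuming the app names from the question, a hypothetical destination named orders, and log as a placeholder sink (the question is truncated before the real sink):

    stream create orders-in --definition "order-source > :orders"
    stream create product-branch --definition ":orders > product-processor | log"
    stream create shipment-branch --definition ":orders > shipment-processor | log"

Because each stream gets its own consumer group on the orders destination, both processors receive a copy of every Order message instead of competing for them.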

Spring Cloud Data Flow does not show Spring Cloud Task execution details

Submitted by 孤人 on 2019-12-25 09:39:31
Question: The Spring Cloud Data Flow documentation mentions:

    When executing tasks externally (i.e. command line) and you wish for Spring Cloud Data Flow to show the TaskExecutions in its UI, be sure that common datasource settings are shared among the both. By default Spring Cloud Task will use a local H2 instance and the execution will not be recorded to the database used by Spring Cloud Data Flow.

I am new to Spring Cloud Data Flow and Spring Cloud Task. Can somebody help me with how to set up a common
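A minimal sketch of such a common datasource, assuming a hypothetical local MySQL database: start both the SCDF server and the task application with the same settings, for example in each one's application.properties:

    spring.datasource.url=jdbc:mysql://localhost:3306/dataflow
    spring.datasource.username=scdf
    spring.datasource.password=secret
    spring.datasource.driver-class-name=com.mysql.jdbc.Driver

With both sides pointing at the same schema, the task records its TASK_EXECUTION rows where the server's UI reads them.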

Spring Data Flow: move messages from one Rabbit vhost to another

Submitted by 你。 on 2019-12-25 01:44:23
Question: TL;DR: I can't seem to pass messages from one RabbitMQ vhost to another. I'm having an issue with Spring Cloud Data Flow where, despite specifying different RabbitMQ vhosts for the source and the sink, messages never reach the destination exchange. My stream looks like this:

    RabbitMQ Source | CustomProcessor | RabbitMQ Sink

The RabbitMQ source reads from a queue on vHostA, and the RabbitMQ sink should output to ExchangeBlah on vHostB. However, no messages end up on
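Pointing the source and the sink at different vhosts is typically done with two named binder configurations in Spring Cloud Stream. A sketch, assuming hypothetical binder names rabbitA and rabbitB and the vhost names from the question:

    spring.cloud.stream.binders.rabbitA.type=rabbit
    spring.cloud.stream.binders.rabbitA.environment.spring.rabbitmq.virtual-host=vHostA
    spring.cloud.stream.binders.rabbitB.type=rabbit
    spring.cloud.stream.binders.rabbitB.environment.spring.rabbitmq.virtual-host=vHostB
    spring.cloud.stream.bindings.input.binder=rabbitA
    spring.cloud.stream.bindings.output.binder=rabbitB

In SCDF these can be passed as deployment properties on the affected apps; without an explicit binder per binding, every binding shares the default connection and therefore the default vhost.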

Run Spring Batch (JSR-352) application on Spring Boot

Submitted by 我是研究僧i on 2019-12-24 18:23:18
Question: I have a simple Spring Batch application that complies with JSR-352. I need to deploy it as a managed task on a Spring Cloud Data Flow server. As far as I know, to deploy it as a task I need to convert the application into a Spring Boot app. I have tried adding the Spring Boot dependencies and a main class, but the batch job does not run when I start the app. Main class:

    @SpringBootConfiguration
    @EnableAutoConfiguration
    @EnableBatchProcessing
    public class Application {
        public static
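Spring Boot's batch auto-configuration only launches Spring Batch Job beans at startup; a JSR-352 job defined in XML is not such a bean, so nothing runs. A minimal sketch, assuming a hypothetical job XML at META-INF/batch-jobs/myJob.xml, of starting it explicitly through the standard JSR-352 JobOperator:

    import java.util.Properties;
    import javax.batch.operations.JobOperator;
    import javax.batch.runtime.BatchRuntime;
    import org.springframework.boot.CommandLineRunner;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class Application {

        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }

        // Kick off the JSR-352 job once the Boot context is up.
        @Bean
        public CommandLineRunner runJsrJob() {
            return args -> {
                JobOperator operator = BatchRuntime.getJobOperator();
                operator.start("myJob", new Properties()); // resolves META-INF/batch-jobs/myJob.xml
            };
        }
    }

With Spring Batch's JSR-352 support on the classpath, BatchRuntime.getJobOperator() returns its JSR-352 operator, so the job runs inside the Boot application and the task can exit when it finishes.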

Change content type for RabbitMQ Spring Cloud Stream Starter App

Submitted by 青春壹個敷衍的年華 on 2019-12-24 12:09:52
Question: The documentation for the Spring Cloud Stream Starter Apps' RabbitMQ source lists several possible content types, each with a different resulting type for the output payload, but it doesn't say how to choose which one you want. I'm deploying a Spring Cloud Data Flow stream that connects the Rabbit source to a Log sink, and all I get is the byte array. Even when I explicitly set the content type to "text/plain" in the Rabbit message's header, it shows up in the log sink as a byte
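The content type is usually chosen on the binding rather than on the incoming AMQP message. A sketch in the SCDF shell, assuming hypothetical stream and queue names:

    stream create rabbit-to-log --definition "rabbit --queues=myqueue | log"
    stream deploy rabbit-to-log --properties "app.rabbit.spring.cloud.stream.bindings.output.contentType=text/plain"

With the source's output binding declaring text/plain, the binder converts the byte[] payload before it reaches the log sink.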