spring-cloud-dataflow

Processor App in Spring Boot 2.2.4/Hoxton.SR1 not working in Spring Cloud Data Flow 2.4.1

守給你的承諾、 submitted on 2020-04-18 03:49:07
Question: I am trying to develop a new application to work on SCDF 2.4.1 and Skipper 2.3.1. I took the samples from https://github.com/sabbyanandan/stream-programming-models and built them locally. I downloaded the Docker Compose file for SCDF with Kafka, set the versions, mounted my repo, and started Docker Compose. When I deploy the "function" module and create a simple stream http | customUpper | log, the sample works fine and I can see the log output as expected. When I modify the function stream app, to
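
For context, a minimal sketch of a function-style processor in the shape of the "customUpper" app from the stream definition above, assuming the Spring Cloud Stream 3.x (Hoxton) functional binding model used by the referenced samples; class and bean names are illustrative:

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class CustomUpperApplication {

    public static void main(String[] args) {
        SpringApplication.run(CustomUpperApplication.class, args);
    }

    // With the functional binding model this bean is bound as
    // customUpper-in-0 / customUpper-out-0, so it can sit between the
    // http source and the log sink in the stream http | customUpper | log.
    @Bean
    public Function<String, String> customUpper() {
        return String::toUpperCase;
    }
}
```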

How can I register SCDF app with microk8s?

╄→尐↘猪︶ㄣ submitted on 2020-04-17 20:26:02
Question: I have installed SCDF in a microk8s cluster; Ubuntu runs in VirtualBox. Now I'm trying to register a custom app in SCDF. My app is built as a Docker image (myorg/myapp:latest) and pushed to a private local registry (localhost:5000). I followed the microk8s documentation at https://microk8s.io/docs/registry-private and added my Docker registry in the containerd-template.toml file: [plugins.cri.registry.mirrors."myorg"] endpoint = ["http://localhost:5000"] But now I can't figure out how to
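
As a rough sketch of the registration step itself (not the containerd configuration), once the cluster can pull the image, the app can be registered through the Data Flow REST endpoint POST /apps/{type}/{name}. The server address (default port 9393 is assumed), the app type, and the registry-qualified image name below are illustrative assumptions:

```java
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class RegisterApp {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();
        MultiValueMap<String, String> form = new LinkedMultiValueMap<>();
        // docker: URIs take the form docker:<repository>/<image>:<tag>;
        // prefixing the private registry host here is an assumption about
        // how the node's container runtime resolves the image.
        form.add("uri", "docker:localhost:5000/myorg/myapp:latest");
        rest.postForObject("http://localhost:9393/apps/processor/myapp", form, String.class);
    }
}
```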

Skipper https rest end point requests returning http urls

て烟熏妆下的殇ゞ submitted on 2020-03-05 03:10:29
Question: I am trying a POC with Spring Cloud Data Flow streams and have the application running in Pivotal Cloud Foundry. Trying the same in Kubernetes, the Spring Data Flow server dashboard does not load. I debugged the issue and found the root cause: when the dashboard loads, it tries to hit the Skipper REST endpoint /api, which returns a response containing the URLs of the other Skipper endpoints, but the returned URLs are all http. How can I force Skipper to return https URLs instead
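
One general Spring Boot approach (a sketch, not a confirmed Skipper-specific fix) when a service behind a TLS-terminating ingress generates http links is to honor the X-Forwarded-* headers set by the proxy, for example by registering Spring's ForwardedHeaderFilter so that generated URLs pick up the forwarded scheme:

```java
import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.filter.ForwardedHeaderFilter;

@Configuration
public class ForwardedHeadersConfig {

    // Rewrites the request's scheme/host/port from X-Forwarded-Proto,
    // X-Forwarded-Host, etc., so links built from the request use https
    // when TLS is terminated upstream at the ingress or router.
    @Bean
    public FilterRegistrationBean<ForwardedHeaderFilter> forwardedHeaderFilter() {
        return new FilterRegistrationBean<>(new ForwardedHeaderFilter());
    }
}
```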

Spring dataflow and GCP Pub Sub

断了今生、忘了曾经 submitted on 2020-03-04 16:41:22
Question: I'm building an event-driven microservice architecture, which is supposed to be cloud-agnostic (as much as possible). Since this is initially going into GCP and I don't want to spend a long time on configuration, I was going to use GCP's Pub/Sub directly for the event queue and take care of other cloud implementations later, but then I came across Spring Cloud Data Flow, which seemed nice because these are Spring Boot microservices and I needed a way to orchestrate them. Does
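
For illustration, a sketch of how Spring Cloud Stream keeps the application code broker-agnostic: the handler below never references Pub/Sub or Kafka, and which broker is used is decided by the binder dependency on the classpath (e.g. the Kafka binder or the GCP Pub/Sub binder). The class name, bean name, and destination are assumptions:

```java
import java.util.function.Consumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class OrderEventsApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderEventsApplication.class, args);
    }

    @Bean
    public Consumer<String> handleOrderEvent() {
        // Hypothetical handler; the destination is bound externally, e.g.
        // spring.cloud.stream.bindings.handleOrderEvent-in-0.destination=orders
        return payload -> System.out.println("Received event: " + payload);
    }
}
```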

Does spring-cloud-dataflow provide support for scheduling applications defined as tasks?

霸气de小男生 submitted on 2020-02-20 06:10:12
Question: I have been looking at using projects built with spring-cloud-task within spring-cloud-dataflow. Having looked at the example projects and the documentation, the indication seems to be that tasks are launched manually through the dashboard or the shell. Does spring-cloud-dataflow provide any way of scheduling task definitions so that they can run, for example, on a cron schedule? That is, can you create a spring-cloud-task app which itself has no knowledge of a schedule, but deploy it to the
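
The task app itself can indeed stay schedule-free; in Data Flow 2.x, scheduling is delegated to the platform (for example a Kubernetes CronJob or the PCF Scheduler) rather than coded into the app. Below is a minimal sketch of such a task app, with illustrative names, that only knows how to do its work once per launch:

```java
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class BatchCleanupTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchCleanupTaskApplication.class, args);
    }

    @Bean
    public CommandLineRunner work() {
        // The task body: runs once per launch, records its execution via
        // Spring Cloud Task, and then the JVM exits. When it runs is decided
        // entirely by whoever launches it (dashboard, shell, or a scheduler).
        return args -> System.out.println("Cleanup task executed");
    }
}
```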

Spring Cloud Dataflow Kubernetes get properties of jar from dockerfile

纵然是瞬间 submitted on 2020-01-25 09:48:06
Question: How can I read the properties of the JAR I have created as a task when I import it as a task app into the Spring Cloud Data Flow Kubernetes server via a Docker image URL? Answer 1: You'll notice the following in the Installation section of the reference guide: Currently, only applications registered with a --uri property pointing to a Docker resource are supported by the Data Flow Server for Kubernetes. However, we do support Maven resources for the --metadata-uri property, which is used to list the
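
As a hedged sketch of what the answer describes, registering a task app with a Docker --uri plus a Maven --metadata-uri (so the app's configuration properties are visible in the dashboard/shell) can be done against the Data Flow REST endpoint POST /apps/{type}/{name}. The server address, image name, Maven coordinates, and the metadata-uri parameter name below are illustrative assumptions:

```java
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class RegisterTaskWithMetadata {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();
        MultiValueMap<String, String> form = new LinkedMultiValueMap<>();
        // The runnable app is the Docker image...
        form.add("uri", "docker:myorg/mytask:latest");
        // ...while the companion metadata JAR (Maven artifact with the
        // "metadata" classifier) supplies the property descriptions.
        form.add("metadata-uri",
                "maven://com.example:mytask-metadata:jar:metadata:1.0.0");
        rest.postForObject("http://localhost:9393/apps/task/mytask", form, String.class);
    }
}
```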

How to execute spring cloud task using rest-api

非 Y 不嫁゛ submitted on 2020-01-24 19:29:25
Question: I know a cloud task can be scheduled and can also be configured for execution as part of a stream. As a developer, I want to execute my Spring Cloud Task using a REST API so that I can run the task on demand. Basically, I have a workflow management system and we are using a Control-M agent, so some of the jobs will be executed by Control-M and some of the tasks will be deployed on the Spring Cloud Data Flow server. Now, when one job completes, the other job in the cloud has to be executed.
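
A minimal sketch of launching an already-defined task on demand over the Data Flow REST API (POST /tasks/executions with a name parameter), which is what an external scheduler such as Control-M could call when the upstream job completes; the server address and task name are placeholders:

```java
import org.springframework.util.LinkedMultiValueMap;
import org.springframework.util.MultiValueMap;
import org.springframework.web.client.RestTemplate;

public class LaunchTaskOnDemand {

    public static void main(String[] args) {
        RestTemplate rest = new RestTemplate();
        MultiValueMap<String, String> form = new LinkedMultiValueMap<>();
        form.add("name", "my-task-definition");
        // Optional command-line arguments could be passed the same way, e.g.
        // form.add("arguments", "--someArg=value");
        String response =
                rest.postForObject("http://localhost:9393/tasks/executions", form, String.class);
        System.out.println("Launched task execution: " + response);
    }
}
```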
