spring-cloud-dataflow

Change content type for RabbitMQ Spring Cloud Stream Starter App

Submitted by 你说的曾经没有我的故事 on 2019-12-24 12:07:06
Question: The documentation for the Spring Cloud Stream Starter Apps' RabbitMQ source lists several possible content types, each with a different resulting type for the output payload, but it doesn't say how to choose the one you want. I'm deploying a Spring Cloud Data Flow stream connecting the Rabbit source to a Log sink, and all I get is the byte array. Even when I explicitly set the content type to "text/plain" in the Rabbit message's header, it shows up in the log sink as a byte…
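A minimal sketch of one way to pin the outgoing content type on the source itself, using the standard Spring Cloud Stream binding property in the stream definition (the stream and queue names are placeholders):

    stream create --name rabbit-to-log --definition "rabbit --queues=my-queue --spring.cloud.stream.bindings.output.contentType=text/plain | log"

With the content type declared on the source's output binding, downstream apps receive an already-converted payload instead of inferring the type from per-message headers.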

Spring Cloud Data Flow: Register new custom Kryo serializer

Submitted by 北战南征 on 2019-12-23 05:44:07
Question: I am creating a system using Spring Cloud Data Flow; I have a source and a transformer. We would like to use Kryo, but some of our classes require custom Kryo serializers (I have written serializers before). We are now using spring-integration-core 4.3.11, and since 4.2 the model has changed (per the docs) to use Codecs instead of the MessageConverter interface. The question is: how can I register the Kryo serializers in the new Codec framework? Do I make a new Codec implementation inheriting from MessageCodec?
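A minimal sketch of one way to do this with the Codec support in org.springframework.integration.codec.kryo, assuming MyClass and a hand-written MyClassSerializer already exist in the application; the registration id (62) is an arbitrary assumption and must not collide with ids the framework already reserves:

    import java.util.Collections;
    import java.util.List;

    import com.esotericsoftware.kryo.Registration;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.codec.Codec;
    import org.springframework.integration.codec.kryo.AbstractKryoRegistrar;
    import org.springframework.integration.codec.kryo.KryoRegistrar;
    import org.springframework.integration.codec.kryo.PojoCodec;

    @Configuration
    public class KryoCodecConfig {

        // MyClass and MyClassSerializer are the application's own types (assumed here).
        // The registrar hands the custom serializer to Kryo under an explicit id.
        @Bean
        public KryoRegistrar myClassRegistrar() {
            return new AbstractKryoRegistrar() {
                @Override
                public List<Registration> getRegistrations() {
                    return Collections.singletonList(
                            new Registration(MyClass.class, new MyClassSerializer(), 62));
                }
            };
        }

        // PojoCodec applies every Registration the registrar provides, so no
        // separate Codec subclass is needed.
        @Bean
        public Codec codec(KryoRegistrar myClassRegistrar) {
            return new PojoCodec(myClassRegistrar);
        }
    }

The idea is to plug a KryoRegistrar into the existing PojoCodec rather than subclassing a codec type.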

App properties in Spring Cloud Data Flow application

Submitted by 与世无争的帅哥 on 2019-12-23 04:24:06
Question: Based on the documentation for Spring Cloud Data Flow (SCDF), only properties prefixed with "deployer." or "app." are considered when deploying an application (be it a source, processor, or sink) as part of a stream. However, I've noticed that besides the prefix, all the properties must be provided as strings, no matter what their original type is; otherwise they are simply discarded by SCDF, as per this line of code: propertiesToUse = DeploymentPropertiesUtils.convert(props);
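A short sketch of how those string-typed properties look when passed through the SCDF shell at deploy time; the stream name, app name, and values are placeholders:

    stream deploy --name my-stream --properties "app.my-app.server.port=9090,deployer.my-app.memory=512m"

Every value on the right-hand side travels as a string and is only converted to its target type inside the deployed application, which is consistent with non-string values being dropped up front.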

Spring Cloud Dataflow Type conversion not working in processor component?

Submitted by 依然范特西╮ on 2019-12-20 02:55:06
Question: I have a processor which transforms byte[] payloads into MyClass payloads:

    @Slf4j
    @EnableBinding(Processor.class)
    public class MyDecoder {

        @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
        public MyClass decode(final byte[] payload) {
            MyClass decoded = doStuff(payload);
            if (decoded != null) {
                log.info("Successfully decoded!");
            }
            return decoded;
        }
    }

I tried creating the following DSL: some-source | my-decoder | some-sink, and some-sink reports errors because…
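One possible direction (a sketch, assuming the sink can consume JSON): declare the processor's output content type explicitly in the stream definition, so the binder serializes MyClass instead of handing the sink a payload type it cannot convert:

    stream create --name decoded-stream --definition "some-source | my-decoder --spring.cloud.stream.bindings.output.contentType=application/json | some-sink"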

How to fix Spring Cloud Data Flow Kubernetes container Readiness probe failed: HTTP probe failed with statuscode: 401

Submitted by 自作多情 on 2019-12-14 03:13:43
Question: I have deployed Spring Cloud Data Flow on Azure AKS using Helm: helm install --name my-release stable/spring-cloud-data-flow (Data Flow Server implementation: spring-cloud-dataflow-server, version 2.0.1.RELEASE). But the liveness and readiness probes fail with 401:

    Events:
    Type     Reason     Age                  From                               Message
    ----     ------     ----                 ----                               -------
    Warning  Unhealthy  10m (x52 over 103m)  kubelet, aks-nodepool1-28921497-0  Liveness probe failed: HTTP probe failed with statuscode: 401
    Warning  BackOff    6m8s…
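One line of investigation (a sketch, not a confirmed fix): a 401 means the kubelet's probe is hitting an endpoint that now requires authentication, so pointing the probes at a path the server exposes anonymously can unblock the rollout. The path, port, and delay below are assumptions that depend on how the chart wires the server:

    livenessProbe:
      httpGet:
        path: /about   # assumption: an endpoint reachable without credentials
        port: 80
      initialDelaySeconds: 45
    readinessProbe:
      httpGet:
        path: /about
        port: 80
      initialDelaySeconds: 45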

Spring Data Flow with 2 sources feeding one processor/sink

Submitted by [亡魂溺海] on 2019-12-13 07:34:51
Question: I'm looking for some advice on setting up a Spring Cloud Data Flow stream for a specific use case. I have two RDBMSs and need to compare the results of queries run against each; the queries should run roughly simultaneously. Based on the result of the comparison, I should be able to send an email through a custom email sink app which I have created. I envision the stream diagram to look something like this (sorry for the Paint job) [diagram not shown]. The problem is that SCDF does not, to my knowledge,…
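A sketch of one way to fan two sources into a single downstream pipeline using SCDF's named destinations (the queries and the destination name are placeholders, and comparator and email-sink stand in for the custom apps):

    stream create --name db1-query --definition "jdbc --query='SELECT ...' > :query-results"
    stream create --name db2-query --definition "jdbc --query='SELECT ...' > :query-results"
    stream create --name compare --definition ":query-results > comparator | email-sink"

Both jdbc sources publish to the :query-results destination and the third stream consumes from it, so a single processor sees the output of both databases.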

Jar not found error while trying to deploy SCDF Stream

Submitted by ╄→尐↘猪︶ㄣ on 2019-12-13 04:13:15
Question: I registered the sink first, as follows:

    app register --name mysink --type sink --uri file:///Users/swatikaushik/Downloads/kafkaStreamDemo/target/kafkaStreamDemo-0.0.1-SNAPSHOT.jar

Then I created a stream:

    stream create --definition ":myKafkaTopic > mysink" --name myStreamName --deploy

I got the error:

    Command failed org.springframework.cloud.dataflow.rest.client.DataFlowClientException:
    File /Users/swatikaushik/Downloads/kafkaStreamDemo/target/kafkaStreamDemo-0.0.1-SNAPSHOT.jar must exist

While…
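One common cause (a sketch, assuming the Data Flow server runs in a different process, VM, or container than the shell): a file:// URI is resolved by the server, not the shell, so the jar must exist on the server's own filesystem. Registering the artifact from a Maven repository the server can reach avoids this; the coordinates here are hypothetical:

    app register --name mysink --type sink --uri maven://com.example:kafkaStreamDemo:0.0.1-SNAPSHOT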

One SCDF source, two processors, but only one processes each item

Submitted by 依然范特西╮ on 2019-12-13 03:56:07
Question: My use case is a variation on this one: Create Stream with one source, two parallel processors and one sink in Spring Cloud Data Flow. In that example, one source emits an item to RabbitMQ and both processors get it. I want the opposite: the source emits items to RabbitMQ, but only one processor handles each item. Let's pretend I have one source named source and two processors named processor1 and processor2. So source emits A, B, C to RabbitMQ. RabbitMQ will emit A, and whichever processor gets A first will…
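A sketch of the competing-consumer pattern this describes, assuming the apps read from a named destination: putting both processors' input bindings in the same consumer group makes the broker deliver each item to exactly one of them (stream, destination, and group names are placeholders):

    stream create --name ingest --definition "source > :work-items"
    stream create --name worker1 --definition ":work-items > processor1 --spring.cloud.stream.bindings.input.group=workers | log"
    stream create --name worker2 --definition ":work-items > processor2 --spring.cloud.stream.bindings.input.group=workers | log"

Without the shared group, each stream gets its own consumer group and every processor receives its own copy of every item.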

SCDF 1.7.3 Docker/Kubernetes: function-runner does not start

Submitted by 风格不统一 on 2019-12-13 03:41:32
Question: I am trying to deploy a function-runner into a stream on SCDF for Kubernetes:

    http --server.port=9001 | f-run: function-runner --function.className=com.example.functions.CharCounter --class-name=com.example.functions.CharCounter --location="maven://io.spring.sample:function-sample:jar:1.0.2" | log

I've created a Docker image using function-runner-kafka 1.1.0.M1. I always get:

    ***************************
    APPLICATION FAILED TO START
    ***************************

    Description:

    Binding to target org.springframework…

Publish null/tombstone message with raw headers

Submitted by 为君一笑 on 2019-12-12 05:12:38
Question: I am building a Spring Cloud Stream Kafka processor app that consumes raw data with a String key and sometimes a null payload from a Kafka topic. I want to produce, to another topic, a String key and the null payload (known as a tombstone within Kafka). In order to use raw headers on the message, I need to output a byte[], but if I encode KafkaNull.INSTANCE into a byte[] it will literally output a String of the object's hashcode. If I try to send anything other than a byte[], I can't use…
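A minimal sketch of one way to emit a true tombstone, assuming native encoding is enabled on the output binding (spring.cloud.stream.bindings.output.producer.useNativeEncoding=true) so that KafkaNull reaches the broker as a genuine null record value instead of being run through a byte[] converter:

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.messaging.Processor;
    import org.springframework.integration.annotation.ServiceActivator;
    import org.springframework.kafka.support.KafkaHeaders;
    import org.springframework.kafka.support.KafkaNull;
    import org.springframework.messaging.Message;
    import org.springframework.messaging.support.MessageBuilder;

    @EnableBinding(Processor.class)
    public class TombstoneProcessor {

        @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
        public Message<?> process(Message<?> in) {
            // Carry the incoming String key forward and emit a KafkaNull payload;
            // with native encoding, the binder writes it as a null record value.
            return MessageBuilder.withPayload(KafkaNull.INSTANCE)
                    .setHeader(KafkaHeaders.MESSAGE_KEY,
                            in.getHeaders().get(KafkaHeaders.RECEIVED_MESSAGE_KEY))
                    .build();
        }
    }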