flink-cep

Flink exactly-once message processing

前提是你 submitted on 2019-12-12 18:17:27
Question: I've set up a Flink 1.2 standalone cluster with 2 JobManagers and 3 TaskManagers, and I'm using JMeter to load-test it by producing Kafka messages/events which are then processed. The processing job runs on a TaskManager and usually handles ~15K events/s. The job has EXACTLY_ONCE checkpointing enabled and persists state and checkpoints to Amazon S3. If I shut down the TaskManager running the job, it takes a bit, a few seconds, then the job is resumed on a different TaskManager. The job mainly
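For reference, a setup like the one described usually combines job code (env.enableCheckpointing(...) with CheckpointingMode.EXACTLY_ONCE) with cluster configuration. A sketch of the relevant flink-conf.yaml entries for Flink 1.2; the bucket name and path are placeholders, not taken from the question:

```yaml
# Sketch only -- bucket and path are placeholders.
state.backend: filesystem
state.backend.fs.checkpointdir: s3://my-bucket/flink/checkpoints
```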

Flink state backend for TaskManager

三世轮回 submitted on 2019-12-12 03:33:40
Question: I have a Flink v1.2 setup with 1 JobManager and 2 TaskManagers, each in its own VM. I configured the state backend to filesystem and pointed it to a local location on each of the above hosts (state.backend.fs.checkpointdir: file:///home/ubuntu/Prototype/flink/flink-checkpoints). I have set parallelism to 1 and each TaskManager has 1 slot. I then run an event processing job on the JobManager, which assigns it to a TaskManager. I kill the TaskManager running the job and after a few
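The failure mode described here is often caused by the checkpoint directory being local to each host: after a TaskManager dies, the surviving node cannot read checkpoints that were written on the other VM. A sketch of a shared-storage configuration (paths are placeholders):

```yaml
# The checkpoint directory must be visible to the JobManager and every TaskManager.
state.backend: filesystem
state.backend.fs.checkpointdir: file:///mnt/shared/flink-checkpoints   # e.g. an NFS mount
# or an object store such as: s3://my-bucket/flink-checkpoints
```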

FlinkCEP: Can I reference an earlier event to define a subsequent match?

心不动则不痛 submitted on 2019-12-11 17:19:14
Question: Here is a simple example: val pattern = Pattern.begin[Event]("start").where(_.getId == 42). next("middle").subtype(classOf[SubEvent]).where(x => x.getVolume == **first event matched**.getVolume) ... Essentially the second event ("middle") needs to access the state of the first event ("start"). Is it possible to do this within FlinkCEP without requiring external state? Answer 1: Sure. You can get the events matched for a specific pattern with the help of the Context. new IterativeCondition<Event>() { private
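The IterativeCondition in the answer is truncated above. Stripped of Flink dependencies, its core logic can be sketched in plain Java: the condition for "middle" receives the events already matched for "start" (what ctx.getEventsForPattern("start") would supply inside Flink) and compares volumes. All class and field names here are illustrative.

```java
import java.util.Arrays;
import java.util.List;

public class VolumeCondition {
    // Minimal stand-in for the event type used in the question.
    static class Event {
        final int id;
        final double volume;
        Event(int id, double volume) { this.id = id; this.volume = volume; }
    }

    // Mirrors IterativeCondition.filter(event, ctx): accept "middle" only if
    // its volume equals that of the first event matched for "start".
    // `startMatches` stands in for ctx.getEventsForPattern("start").
    static boolean filter(Event middle, List<Event> startMatches) {
        if (startMatches.isEmpty()) return false;
        return middle.volume == startMatches.get(0).volume;
    }

    public static void main(String[] args) {
        List<Event> starts = Arrays.asList(new Event(42, 10.0));
        System.out.println(filter(new Event(7, 10.0), starts)); // true
        System.out.println(filter(new Event(8, 11.0), starts)); // false
    }
}
```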

Simple Scala API for CEP example doesn't show any output

旧街凉风 submitted on 2019-12-08 05:25:05
Question: I'm programming a simple example to test the new Scala API for CEP in Flink, using the latest GitHub version for 1.1-SNAPSHOT. The pattern is only a check for a value, and it outputs a single String as a result for each pattern matched. The code is as follows: val pattern : Pattern[(String, Long, Int), _] = Pattern.begin("start").where(_._3 < 4) val cepEventAlert = CEP.pattern(streamingAlert, pattern) def selectFn(pattern : mutable.Map[String, (String, Long, Int)]): String = { val startEvent =

Flink CEP: Which method to join data streams for different types of events?

廉价感情. submitted on 2019-12-06 10:45:54
Suppose that I have 2 different types of data streams, one providing weather data and the other providing vehicle data, and I would like to use Flink to do complex event processing on them. Which method in Flink 1.3.x is the correct one to use? I saw different methods like Union, Connect, and Window Join. Basically I just want to try a simple CEP rule like this: IF weather is wet AND vehicle speed > 60 WITHIN the last 10 seconds THEN raise an alert. Thanks! In my opinion, there are two ways to solve this problem: Use a common parent type for the different types of events and connect the two streams
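Whichever combination operator is chosen, the rule itself reduces to checking both conditions inside the same time window. A plain-Java sketch of that rule logic, with no Flink APIs; thresholds, field names, and timestamps are illustrative:

```java
import java.util.Arrays;
import java.util.List;

public class WetSpeedRule {
    static final long WINDOW_MS = 10_000;

    static class Weather {
        final long ts; final boolean wet;
        Weather(long ts, boolean wet) { this.ts = ts; this.wet = wet; }
    }
    static class Vehicle {
        final long ts; final double speed;
        Vehicle(long ts, double speed) { this.ts = ts; this.speed = speed; }
    }

    // Alert if a wet-weather reading and a > 60 speed reading both fall
    // inside the last WINDOW_MS before `now`.
    static boolean alert(List<Weather> weather, List<Vehicle> vehicles, long now) {
        boolean wet = weather.stream()
                .anyMatch(w -> w.wet && w.ts <= now && now - w.ts <= WINDOW_MS);
        boolean fast = vehicles.stream()
                .anyMatch(v -> v.speed > 60 && v.ts <= now && now - v.ts <= WINDOW_MS);
        return wet && fast;
    }

    public static void main(String[] args) {
        List<Weather> weather = Arrays.asList(new Weather(1000, true));
        List<Vehicle> vehicles = Arrays.asList(new Vehicle(5000, 80));
        System.out.println(alert(weather, vehicles, 9000));  // true: both within 10 s
        System.out.println(alert(weather, vehicles, 20000)); // false: weather reading too old
    }
}
```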

Is it possible to process multiple streams in apache flink CEP?

老子叫甜甜 submitted on 2019-12-06 08:46:07
My question is this: if we have two raw event streams, i.e. Smoke and Temperature, and we want to find out whether a complex event, i.e. Fire, has happened by applying operators to the raw streams, can we do this in Flink? I am asking because all the examples I have seen so far for Flink CEP involve only one input stream. Please correct me if I am wrong. Short answer: Yes, you can read and process multiple streams and fire rules based on your event types from the different stream sources. Long answer: I had a somewhat similar requirement, and my answer is based on the assumption that you
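One common approach is to map both sources to a common parent type so they can be unioned into a single stream, then detect the complex event over the merged stream. A dependency-free Java sketch of that idea; all class names, thresholds, and the detection rule are illustrative, not taken from the question:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class FireDetector {
    // A common parent type lets two sources share one stream, mirroring
    // stream1.union(stream2) (or connect()) in Flink.
    static abstract class MonitoringEvent {
        final long ts;
        MonitoringEvent(long ts) { this.ts = ts; }
    }
    static class Smoke extends MonitoringEvent {
        Smoke(long ts) { super(ts); }
    }
    static class Temperature extends MonitoringEvent {
        final double celsius;
        Temperature(long ts, double celsius) { super(ts); this.celsius = celsius; }
    }

    // "Fire" when smoke is followed by a high temperature reading within windowMs.
    static boolean fire(List<MonitoringEvent> merged, double threshold, long windowMs) {
        List<MonitoringEvent> sorted = new ArrayList<>(merged);
        sorted.sort(Comparator.comparingLong((MonitoringEvent e) -> e.ts));
        Long lastSmoke = null;
        for (MonitoringEvent e : sorted) {
            if (e instanceof Smoke) {
                lastSmoke = e.ts;
            } else if (e instanceof Temperature
                    && ((Temperature) e).celsius > threshold
                    && lastSmoke != null
                    && e.ts - lastSmoke <= windowMs) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        List<MonitoringEvent> events = Arrays.asList(new Smoke(1000), new Temperature(3000, 80));
        System.out.println(fire(events, 50, 10000)); // true
    }
}
```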

Flink and dynamic template recognition

。_饼干妹妹 submitted on 2019-12-04 19:07:15
We plan to use Flink CEP to process a large volume of events according to dynamic templates. The system must recognize chains of events (sometimes complicated chains with conditions and grouping). The templates will be created by users; in other words, we have to create complicated templates without touching the code. Is it possible to use Apache Flink to solve this problem? Does Flink support dynamic templates? At the moment Flink's CEP library does not support this kind of dynamic rule adaptation. However, there is no fundamental reason that makes it impossible to implement. In fact,

Unable to execute CEP pattern in Flink dashboard version 1.3.2 which is caused by ClassNotFoundException

这一生的挚爱 submitted on 2019-12-02 17:26:58
Question: I have written a simple pattern like this: Pattern<JoinedEvent, ?> pattern = Pattern.<JoinedEvent>begin("start") .where(new SimpleCondition<JoinedEvent>() { @Override public boolean filter(JoinedEvent streamEvent) throws Exception { return streamEvent.getRRInterval() >= 10; } }).within(Time.milliseconds(WindowLength)); and it executes well in IntelliJ IDEA. I am using Flink 1.3.2 both in the dashboard and in IntelliJ IDEA. While building Flink from source, I saw a lot of warning messages which led me to believe that iterative condition classes have not been included in a jar as error
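A common cause of this ClassNotFoundException when submitting through the dashboard is that flink-cep is not in the Flink distribution's lib directory, so the CEP classes must either ship inside the job's fat jar or be copied into lib/ on the cluster. A sketch of the Maven dependency to bundle; the Scala version suffix is an assumption:

```xml
<!-- Sketch: bundle flink-cep into the job jar so the cluster can load it.
     The _2.11 suffix is an assumption about the Scala version in use. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-cep_2.11</artifactId>
    <version>1.3.2</version>
</dependency>
```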

java.io.NotSerializableException using Apache Flink with Lagom

耗尽温柔 submitted on 2019-12-02 15:10:14
Question: I am writing a Flink CEP program inside Lagom's microservice implementation. My Flink CEP program runs perfectly fine as a simple Scala application, but when I use this code inside the Lagom service implementation I receive the following exception. Lagom service implementation: override def start = ServiceCall[NotUsed, String] { val env = StreamExecutionEnvironment.getExecutionEnvironment var executionConfig = env.getConfig env.setParallelism(1) executionConfig.disableSysoutLogging() var
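A NotSerializableException here typically means a user function captured something non-serializable, often the enclosing Lagom service instance. Flink distributes user functions to the TaskManagers via Java serialization, so each function (and everything it closes over) must be Serializable and self-contained. A plain-Java illustration of a serializable lambda surviving a serialize/deserialize round trip; no Flink or Lagom types are involved, and all names are illustrative:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Predicate;

public class SerializableClosureDemo {
    // A lambda becomes serializable when its target type extends Serializable
    // and it captures only serializable values.
    interface SerializablePredicate<T> extends Predicate<T>, Serializable {}

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    @SuppressWarnings("unchecked")
    static <T> T deserialize(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return (T) ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        SerializablePredicate<Integer> p = x -> x > 10;   // captures nothing
        SerializablePredicate<Integer> copy = deserialize(serialize(p));
        System.out.println(copy.test(42)); // true
    }
}
```

Capturing a field of the surrounding service would drag the whole service into the closure; keeping functions as standalone serializable objects avoids that.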
