lagom

Best approach to ingest Streaming Data in Lagom Microservice

Submitted by 一世执手 on 2019-12-06 11:54:39

Question: I am creating a streaming analytics application in which each analytics function will be implemented as a microservice, so that the analytics can be reused in different projects later. I am using Lagom to create the microservices. I am new to Lagom, which is why I have some doubts. I don't understand what the best approach is to POST my stream of data (coming from multiple sensors) to a microservice, which then publishes the data to a Kafka topic. Is Lagom's streamed-message feature in the service description, ServiceCall[Source[String, NotUsed], Source[String, NotUsed]], …
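The shape the question mentions is indeed how Lagom models this: the service descriptor declares a streamed call taking a Source, and a broker topic that downstream services can subscribe to. A sketch of such a descriptor, assuming the Lagom scaladsl API; the names SensorService, ingest, sensor-service and sensor-events are illustrative, not from the original post, and this fragment only compiles against the Lagom artifacts, not standalone:

```scala
import akka.NotUsed
import akka.stream.scaladsl.Source
import com.lightbend.lagom.scaladsl.api.broker.Topic
import com.lightbend.lagom.scaladsl.api.{Descriptor, Service, ServiceCall}

trait SensorService extends Service {

  // Streamed service call: the client opens a WebSocket, pushes sensor
  // readings as a Source, and receives a stream of acknowledgements back.
  def ingest: ServiceCall[Source[String, NotUsed], Source[String, NotUsed]]

  // Broker topic that other services can subscribe to.
  def sensorEvents: Topic[String]

  override final def descriptor: Descriptor = {
    import Service._
    named("sensor-service")
      .withCalls(namedCall("ingest", ingest _))
      .withTopics(topic("sensor-events", sensorEvents))
  }
}
```

One caveat worth noting: Lagom's Topic API is primarily designed for publishing events out of persistent entities via TopicProducer, so for pushing a raw inbound sensor stream straight to Kafka, wiring an Alpakka Kafka (akka-stream-kafka) sink inside the ingest implementation is a common alternative.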

Column family ID mismatch (found cebcc380-72d4-11e7-9a6b-bd620b945799; expected c05d6970-72d4-11e7-9a6b-bd620b945799)

Submitted by Deadly on 2019-12-06 07:34:27

How can I resolve this error?

Column family ID mismatch (found cebcc380-72d4-11e7-9a6b-bd620b945799; expected c05d6970-72d4-11e7-9a6b-bd620b945799)
Caused by: java.util.concurrent.ExecutionException: org.apache.cassandra.exceptions.ConfigurationException: Column family ID mismatch (found cebcc380-72d4-11e7-9a6b-bd620b945799; expected c05d6970-72d4-11e7-9a6b-bd620b945799)
    at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[na:1.8.0_131]
    at java.util.concurrent.FutureTask.get(FutureTask.java:192) ~[na:1.8.0_131]
    at org.apache.cassandra.utils.FBUtilities.waitOnFuture(FBUtilities.java…
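This error usually means the table ID recorded on disk (or in another node's schema) no longer matches the ID in the current schema; the classic trigger is dropping and re-creating a table, or racing concurrent schema changes, so two different IDs exist for the same table name. Cassandra embeds the table ID (dashes removed) in the data directory name, which makes the conflict visible on disk. A sketch of what that looks like, using a temporary directory and hypothetical keyspace/table names (mykeyspace, sensors) rather than touching a real installation:

```shell
# Simulate a Cassandra data directory holding two directories for the
# same table name but with different embedded table IDs -- the on-disk
# situation behind "Column family ID mismatch".
data=$(mktemp -d)
mkdir -p "$data/mykeyspace/sensors-c05d697072d411e79a6bbd620b945799"  # ID the schema expects
mkdir -p "$data/mykeyspace/sensors-cebcc38072d411e79a6bbd620b945799"  # stale ID found on disk

# The suffix after the table name must match the "expected" UUID from
# the error, with dashes removed. List the conflicting pair:
ls "$data/mykeyspace"
```

On a real node the directories live under the configured data_file_directories (often /var/lib/cassandra/data/<keyspace>/). The usual recovery is to stop the node (nodetool drain first), move the directory with the stale ID out of the way, and restart; on a multi-node cluster, nodetool resetlocalschema can rebuild a node's schema from its peers. Treat any deletion as destructive and back the directory up first.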

java.io.NotSerializableException using Apache Flink with Lagom

Submitted by 耗尽温柔 on 2019-12-02 15:10:14

Question: I am writing a Flink CEP program inside a Lagom microservice implementation. My Flink CEP program runs perfectly fine as a simple Scala application, but when I use the same code inside the Lagom service implementation I receive the following exception. Lagom service implementation:

override def start = ServiceCall[NotUsed, String] {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  var executionConfig = env.getConfig
  env.setParallelism(1)
  executionConfig.disableSysoutLogging()
  var topic_name = "topic_test"
  var props = new Properties
  props.put("bootstrap.servers", "localhost:9092")
  props…
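The usual cause of a NotSerializableException here is that a lambda handed to a Flink operator closes over something non-serializable from the enclosing Lagom service: the StreamExecutionEnvironment, the service instance itself, or one of its fields. Flink ships operator functions to task managers with Java serialization, so everything a closure captures must be Serializable. A self-contained sketch of the failure mode and the fix; the Environment class below is a stand-in for any non-serializable object, and nothing here is Flink or Lagom API:

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

object ClosureCaptureDemo {
  // Stand-in for a non-serializable object such as a
  // StreamExecutionEnvironment or the enclosing service instance.
  final class Environment { val parallelism: Int = 1 }

  // True if `obj` survives Java serialization, false if a
  // NotSerializableException is thrown along the way.
  def serializable(obj: AnyRef): Boolean =
    try {
      new ObjectOutputStream(new ByteArrayOutputStream).writeObject(obj)
      true
    } catch { case _: NotSerializableException => false }

  def demo(): (Boolean, Boolean) = {
    val env = new Environment

    // BAD: the lambda captures `env`, so serializing the lambda drags
    // the whole non-serializable Environment along with it.
    val bad: Int => Int = x => x * env.parallelism

    // GOOD: copy the primitive out into a local first; the lambda then
    // captures only an Int.
    val parallelism = env.parallelism
    val good: Int => Int = x => x * parallelism

    (serializable(bad), serializable(good))
  }

  def main(args: Array[String]): Unit = println(demo())
}
```

The same discipline applies in the Lagom service: build the execution environment and all Flink operators inside the ServiceCall body, keep captured values to primitives and Serializable types, and never let an operator lambda reference the service class's fields directly.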
