Big data architecture

2021-07-15 Thread Aissa Elaffani
Hello Guys, I'm sorry for asking this question here; it is not related to Apache Flink, but if someone can help I would be so grateful. I want to build a big data architecture for batch processing; we have a lot of data that is generated every day, and we receive it from a lot of sources

Multiple Kafka topics

2020-08-09 Thread Aissa Elaffani
Hello Guys, I am working on a Flink application in which I consume data from Apache Kafka. The data is published to three topics of the cluster, and I need to read from all of them; I suppose I can create three FlinkKafkaConsumer instances. The data I am consuming has the same format {Id_sensor:,
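
A single FlinkKafkaConsumer can subscribe to a list of topics, so three separate consumers are not strictly needed. Below is a minimal sketch; the broker address, group id, and topic names are placeholders, and plain string records are assumed:

    import java.util.Arrays;
    import java.util.Properties;

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

    public class MultiTopicJob {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
            props.setProperty("group.id", "sensor-consumer");         // placeholder group id

            // One consumer instance subscribed to all three (hypothetical) topics.
            FlinkKafkaConsumer<String> consumer = new FlinkKafkaConsumer<>(
                    Arrays.asList("topic-a", "topic-b", "topic-c"),
                    new SimpleStringSchema(),
                    props);

            DataStream<String> records = env.addSource(consumer);
            records.print();

            env.execute("multi-topic consumer");
        }
    }

Since the records share one format, the same deserialization schema can be reused for all three topics.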

CEP use case?

2020-07-16 Thread Aissa Elaffani
Hello Guys, I have some sensors generating data about their state (temperature, humidity, positioning, ...), and I want to apply some rules (simple conditions, e.g. if temperature > 25, ...) in order to determine whether the sensor is in "Normal" status or "Alerte" status. Do I need to use Flink CEP, or just
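
For a per-record condition such as temperature > 25, a plain stateless map (or filter) is usually enough; CEP only pays off when a rule has to correlate several events over time. A minimal sketch, assuming a hypothetical SensorReading POJO:

    import org.apache.flink.api.common.functions.MapFunction;

    public class StatusRules {

        // Minimal hypothetical POJO; the real messages carry more fields.
        public static class SensorReading {
            public String idSensor;
            public double temperature;
            public String status;
        }

        // Tags each record as "Alerte" or "Normal" from a single threshold.
        public static class ThresholdRule implements MapFunction<SensorReading, SensorReading> {
            @Override
            public SensorReading map(SensorReading reading) {
                reading.status = reading.temperature > 25 ? "Alerte" : "Normal";
                return reading;
            }
        }
    }

Usage would be along the lines of readings.map(new StatusRules.ThresholdRule()).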

ERROR submitting a Flink job

2020-07-14 Thread Aissa Elaffani
Hello Guys, I am trying to launch a Flink app on a remote server, but I get this error message: org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID:

Deploying a Flink app on a server;

2020-07-13 Thread Aissa Elaffani
Hello Guys, Can someone please explain to me how I can deploy a Flink app on a server, and the steps I need to follow in order to achieve that? Sorry for disturbing you guys. Aissa

CEP use case!

2020-06-25 Thread Aissa Elaffani
Hello Guys, I am asking whether the CEP API can handle my use case. Actually, I have a lot of sensors generating data, and I want to apply a rules engine to those sensors' data in order to set a "sensor_status" to Normal, Alert, or Warning. For each record I want to apply some conditions
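
For per-record conditions a simple map is enough (see the sketch under the other CEP thread above), but the CEP Pattern API becomes useful when a rule spans a sequence of events. A sketch, assuming the flink-cep dependency, a hypothetical SensorReading POJO, and an illustrative rule of three readings above 25 within one minute for the same sensor:

    import java.util.List;
    import java.util.Map;

    import org.apache.flink.cep.CEP;
    import org.apache.flink.cep.PatternSelectFunction;
    import org.apache.flink.cep.PatternStream;
    import org.apache.flink.cep.pattern.Pattern;
    import org.apache.flink.cep.pattern.conditions.SimpleCondition;
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.windowing.time.Time;

    public class CepSketch {

        // Same hypothetical POJO as in the simple-rule sketch.
        public static class SensorReading {
            public String idSensor;
            public double temperature;
        }

        public static DataStream<String> alerts(DataStream<SensorReading> readings) {
            // Illustrative rule: three readings above 25 within one minute.
            Pattern<SensorReading, ?> pattern = Pattern.<SensorReading>begin("high")
                    .where(new SimpleCondition<SensorReading>() {
                        @Override
                        public boolean filter(SensorReading r) {
                            return r.temperature > 25;
                        }
                    })
                    .times(3)
                    .within(Time.minutes(1));

            PatternStream<SensorReading> matches =
                    CEP.pattern(readings.keyBy(r -> r.idSensor), pattern);

            return matches.select(new PatternSelectFunction<SensorReading, String>() {
                @Override
                public String select(Map<String, List<SensorReading>> match) {
                    return "Alert for sensor " + match.get("high").get(0).idSensor;
                }
            });
        }
    }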

Window Function use case;

2020-06-04 Thread Aissa Elaffani
Hello guys, I have a use case where I am receiving data from sensors about their status (Normal or Alerte), e.g. {SensorID:"1", FactoryID:"1", Status:"Normal" ..}. A factory can contain a lot of sensors, so what I want to do is: if the status of one sensor in a factory is Alerte, I want to raise an
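
One way to express this is to key the stream by factory and inspect each window with a ProcessWindowFunction, emitting an alert as soon as any sensor in that factory reports Alerte. A sketch, assuming a hypothetical SensorStatus POJO and an arbitrary one-minute processing-time window:

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.windowing.ProcessWindowFunction;
    import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
    import org.apache.flink.streaming.api.windowing.time.Time;
    import org.apache.flink.streaming.api.windowing.windows.TimeWindow;
    import org.apache.flink.util.Collector;

    public class FactoryAlert {

        // Minimal hypothetical POJO for the {SensorID, FactoryID, Status} records.
        public static class SensorStatus {
            public String sensorId;
            public String factoryId;
            public String status;
        }

        public static DataStream<String> factoryAlerts(DataStream<SensorStatus> statuses) {
            return statuses
                    .keyBy(s -> s.factoryId)
                    // The window size is an assumption; event-time windows would
                    // need timestamps and watermarks on top of this.
                    .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
                    .process(new ProcessWindowFunction<SensorStatus, String, String, TimeWindow>() {
                        @Override
                        public void process(String factoryId,
                                            Context ctx,
                                            Iterable<SensorStatus> readings,
                                            Collector<String> out) {
                            for (SensorStatus s : readings) {
                                if ("Alerte".equals(s.status)) {
                                    out.collect("Factory " + factoryId + " has a sensor in Alerte status");
                                    return;
                                }
                            }
                        }
                    });
        }
    }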

Data Stream Enrichment

2020-05-30 Thread Aissa Elaffani
Hello Guys, I want to enrich a data stream with some MongoDB data, and I am planning to use the RichFlatMapFunction, but I am lost; I don't know where to configure the connection to my MongoDB. Can anyone help me with this? Best, Aissa
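
With a RichFlatMapFunction the usual place to create the MongoDB connection is open(), which runs once per parallel task before any record is processed, and to release it in close(). A sketch, assuming the MongoDB sync Java driver and placeholder connection details, database, collection, and field names:

    import org.apache.flink.api.common.functions.RichFlatMapFunction;
    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.util.Collector;

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import org.bson.Document;

    public class MongoEnricher extends RichFlatMapFunction<String, String> {

        private transient MongoClient client;
        private transient MongoCollection<Document> sensors;

        @Override
        public void open(Configuration parameters) {
            // Created once per parallel task, not per record and not in the constructor.
            client = MongoClients.create("mongodb://localhost:27017");
            sensors = client.getDatabase("iot").getCollection("sensors");
        }

        @Override
        public void flatMap(String sensorId, Collector<String> out) {
            // Look up reference data for the incoming record and emit the enriched result.
            Document meta = sensors.find(Filters.eq("_id", sensorId)).first();
            if (meta != null) {
                out.collect(sensorId + "," + meta.getString("factory"));
            }
        }

        @Override
        public void close() {
            if (client != null) {
                client.close();
            }
        }
    }

A per-record lookup like this adds latency; caching the reference data in open(), or keeping it in Flink state, are common refinements.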

Multiple sources

2020-05-27 Thread Aissa Elaffani
Hello everyone, I hope you are all doing well. I am reading from a Kafka topic some real-time messages produced by sensors, and in order to do some aggregations, I need to enrich the stream with other data that is stored in a MongoDB database. So, I want to know if it is possible to work with two sources
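
A Flink job can declare any number of sources, and two streams can be combined with connect() (or the MongoDB data can be looked up per record, as in the enrichment thread above). A sketch using fromElements as stand-ins for the real Kafka and MongoDB sources:

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.api.functions.co.CoMapFunction;

    public class TwoSources {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Stand-ins: the real job would use a Kafka source and a custom MongoDB source.
            DataStream<String> measurements = env.fromElements("sensor-1,23.5");
            DataStream<String> referenceData = env.fromElements("sensor-1,factory-A");

            DataStream<String> combined = measurements
                    .connect(referenceData)
                    .map(new CoMapFunction<String, String, String>() {
                        @Override
                        public String map1(String measurement) {
                            return "measurement: " + measurement;
                        }

                        @Override
                        public String map2(String reference) {
                            return "reference: " + reference;
                        }
                    });

            combined.print();
            env.execute("two sources");
        }
    }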

Flink suggestions;

2020-05-14 Thread Aissa Elaffani
Hello Guys, I am a beginner in the field of real-time streaming and I am working with Apache Flink, so I am not yet familiar with many of its features. I am building an application in which I receive sensor data in this format: {"status": "Alerte", "classe": " ", "value": {"temperature":

MongoDB sink;

2020-05-06 Thread Aissa Elaffani
Hello, I want to sink my data to MongoDB, but as far as I know there is no sink connector for MongoDB. How can I implement a MongoDB sink? If there are any other solutions, I hope you can share them with me.
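
If no ready-made MongoDB sink connector is available, a custom sink can extend RichSinkFunction, opening the client in open() and writing each record in invoke(). A sketch, assuming the MongoDB sync Java driver and placeholder connection details:

    import org.apache.flink.configuration.Configuration;
    import org.apache.flink.streaming.api.functions.sink.RichSinkFunction;

    import com.mongodb.client.MongoClient;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import org.bson.Document;

    public class MongoSink extends RichSinkFunction<Document> {

        private transient MongoClient client;
        private transient MongoCollection<Document> collection;

        @Override
        public void open(Configuration parameters) {
            // One client per parallel sink task; URI and names are placeholders.
            client = MongoClients.create("mongodb://localhost:27017");
            collection = client.getDatabase("iot").getCollection("results");
        }

        @Override
        public void invoke(Document value, Context context) {
            // One insert per record; batching with insertMany would cut round trips.
            collection.insertOne(value);
        }

        @Override
        public void close() {
            if (client != null) {
                client.close();
            }
        }
    }

Records would be mapped to org.bson.Document first, then written with stream.addSink(new MongoSink()). A simple sink like this gives no transactional guarantees on its own.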

MongoDB as a Sink;

2020-05-05 Thread Aissa Elaffani
Hello Guys, I am looking for some help concerning my Flink sink; I want the output to be stored in a MongoDB database. As far as I know, there is no sink connector for MongoDB, so I need to implement one myself, and I don't know how to do that. Can you please help me with this?

Flink pipeline;

2020-05-05 Thread Aissa Elaffani
Hello Guys, I am new to the real-time streaming field, and I am trying to build a big data architecture for processing real-time streams. I have some sensors that generate data in JSON format; they are sent to an Apache Kafka cluster, and then I want to consume them with Apache Flink in order to do some

Flink Deserialisation JSON to Java;

2020-05-04 Thread Aissa Elaffani
Hello, Please can you share with me some demos or examples of deserialization with Flink? I need to consume some Kafka messages produced by sensors in JSON format. Here is my JSON message: {"date": "2018-05-31 15:10", "main": {"ph": 5.0, "whc": 60.0, "temperature": 9.5, "humidity": 96}, "id":
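
One common approach is to extend AbstractDeserializationSchema and parse the bytes with Jackson into a POJO whose fields mirror the JSON keys. A sketch covering only the part of the message shown above, assuming Jackson is on the classpath:

    import java.io.IOException;

    import org.apache.flink.api.common.serialization.AbstractDeserializationSchema;

    import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class JsonDeserialization {

        // POJO mirroring part of the JSON message; unknown keys are ignored.
        @JsonIgnoreProperties(ignoreUnknown = true)
        public static class SensorReading {
            public String date;
            public String id;
            public Main main;

            @JsonIgnoreProperties(ignoreUnknown = true)
            public static class Main {
                public double ph;
                public double whc;
                public double temperature;
                public double humidity;
            }
        }

        public static class Schema extends AbstractDeserializationSchema<SensorReading> {

            private transient ObjectMapper mapper;

            @Override
            public SensorReading deserialize(byte[] message) throws IOException {
                // Created lazily so the schema itself stays serializable.
                if (mapper == null) {
                    mapper = new ObjectMapper();
                }
                return mapper.readValue(message, SensorReading.class);
            }
        }
    }

The schema can then be passed to the Kafka consumer in place of SimpleStringSchema, e.g. new FlinkKafkaConsumer<>("sensor-topic", new JsonDeserialization.Schema(), props).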