(e.g. 180 minutes). I would of course like to use these values, together with the missing ones, so that values for the next 24 hours (one value per minute) can be predicted.
Thank you in advance.
Regards,
Daniela
Sent: Monday, 02 January 2017 at 22:30
From: "Marco Mistroni"
To:
, 02 January 2017 at 21:07
From: "Marco Mistroni"
To: "Daniela S"
Cc: User
Subject: Re: Spark Streaming prediction
Hi
you might want to have a look at the regression ML algorithms and integrate one into your Spark Streaming application; I'm sure someone on the list has
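For reference, the idea behind combining regression with Spark Streaming is that the model is updated incrementally with each mini-batch and can then predict on later data (MLlib's StreamingLinearRegressionWithSGD works this way via trainOn/predictOn on DStreams). A minimal plain-Python sketch of that per-batch SGD update, self-contained so it runs without a Spark cluster; the class name and learning rate are illustrative:

```python
class OnlineLinearModel:
    """Stand-in for a streaming linear regressor: each mini-batch
    of (x, y) pairs nudges the weights via one SGD pass."""

    def __init__(self, lr=0.05):
        self.w = 0.0   # slope
        self.b = 0.0   # intercept
        self.lr = lr   # SGD step size

    def train_on(self, batch):
        # one stochastic-gradient pass over the mini-batch
        for x, y in batch:
            err = (self.w * x + self.b) - y
            self.w -= self.lr * err * x
            self.b -= self.lr * err

    def predict(self, x):
        return self.w * x + self.b

model = OnlineLinearModel(lr=0.05)
# simulate a stream of mini-batches drawn from y = 2x + 1
for _ in range(200):
    model.train_on([(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)])
print(round(model.predict(3.0), 1))
```

In a real Spark Streaming job the loop would be replaced by trainOn on a DStream of LabeledPoints, but the update rule per batch is the same.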
Hi
I am trying to solve the following problem with Spark Streaming.
I receive timestamped events from Kafka. Each event refers to a device and contains values for every minute of the next 2 to 3 hours. What I would like to do is to predict the minute values for the next 24 hours. So I would li
Hi
I have some questions regarding Spark Streaming.
I receive a stream of JSON messages from Kafka.
The messages consist of a timestamp and an ID.
timestamp ID
2016-12-06 13:00 1
2016-12-06 13:40 5
...
In a database I have values for each ID:
ID m
Hi,
I am a newbie in Spark Streaming and have some questions.
1) Is it possible to group a stream in Spark Streaming like in Storm (fields grouping)?
2) Could the batch size be used instead of a time window?
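Regarding question 1: the closest Spark Streaming analogue to Storm's fields grouping is keying the DStream and applying reduceByKey or groupByKey, which partitions the records of each batch by key. A plain-Python sketch of what reduceByKey computes for one mini-batch (function and variable names are illustrative):

```python
from collections import defaultdict
from functools import reduce

def reduce_by_key(batch, fn):
    # group the (key, value) records of one mini-batch by key,
    # then fold each group with fn -- the per-batch semantics of
    # DStream.reduceByKey
    groups = defaultdict(list)
    for k, v in batch:
        groups[k].append(v)
    return {k: reduce(fn, vs) for k, vs in groups.items()}

batch = [(1, 10), (5, 3), (1, 7)]
print(reduce_by_key(batch, lambda a, b: a + b))  # {1: 17, 5: 3}
```

On question 2: the batch interval is the smallest unit of processing, while a window spans several consecutive batches, so the batch interval can replace a window only when the window length equals one batch.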
Thank you in advance.
Regards,
Daniela
-
Hi,
I would like to cache values and to use only the latest "valid" values to build
a sum.
In more detail, I receive values from devices periodically. I would like to add
up all the valid values each minute. But not every device sends a new value
every minute. And as long as there is no new val
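The pattern described here (keep only the latest reading per device, and each minute sum the readings still considered valid) can be sketched in plain Python; in a Spark Streaming job the same per-device state would typically live in updateStateByKey or mapWithState. The class name and the 120-second validity window are assumptions, since the mail is cut off before it defines "valid":

```python
class LatestValueCache:
    """Cache the most recent reading per device; a reading counts
    as valid until a newer one arrives or it is older than
    max_age seconds (the ageing rule is an assumption)."""

    def __init__(self, max_age=120.0):
        self.max_age = max_age
        self.latest = {}          # device_id -> (value, timestamp)

    def update(self, device_id, value, ts):
        # a newer reading replaces the cached one for that device
        self.latest[device_id] = (value, ts)

    def valid_sum(self, now):
        # sum over devices whose cached value is still fresh enough
        return sum(v for v, ts in self.latest.values()
                   if now - ts <= self.max_age)

cache = LatestValueCache(max_age=120)
cache.update("dev1", 10.0, ts=0)
cache.update("dev2", 5.0, ts=60)
cache.update("dev1", 12.0, ts=90)   # replaces dev1's older reading
print(cache.valid_sum(now=100))      # 17.0
```

Devices that stay silent keep contributing their last value until it ages out, which matches the "as long as there is no new value" behaviour the mail starts to describe.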