From: "Marco Mistroni" <mmistr...@gmail.com>
To: "Daniela S" <daniela_4...@gmx.at>
Cc: User <user@spark.apache.org>
Subject: Re: Re: Spark Streaming prediction
Apologies, perhaps I misunderstood your use case.
My assumption was that you have 2-3 hours' worth of data and you want to
Sent: 02 January 2017 at 21:07
From: "Marco Mistroni" <mmistr...@gmail.com>
To: "Daniela S" <daniela_4...@gmx.at>
Cc: User <user@spark.apache.org>
Subject: Re: Spark Streaming prediction
Hi
you might want to have a look at the Regression ML algorithm a
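To make the regression suggestion concrete, here is a minimal pure-Python sketch of the idea: fit a least-squares line to the minute values received so far and extrapolate the next minutes. At scale you would use spark.ml's regression estimators on DataFrame columns instead; the function names below are illustrative, not Spark API.

```python
# Sketch only: ordinary least squares on the minute values seen so far,
# then extrapolation. In Spark you would train a
# spark.ml.regression.LinearRegression model on feature/label columns.

def fit_line(values):
    """Fit y = a + b*x by least squares, with x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

def extrapolate(values, horizon):
    """Predict the next `horizon` minute values from the fitted line."""
    a, b = fit_line(values)
    n = len(values)
    return [a + b * (n + i) for i in range(horizon)]

history = [10.0, 12.0, 14.0, 16.0]   # minute values received so far
print(extrapolate(history, 3))       # predictions for the next 3 minutes
```

A plain linear fit is only a starting point; for 24-hour horizons with daily patterns you would likely add time-of-day features to the regression.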
Hi
I am trying to solve the following problem with Spark Streaming.
I receive timestamped events from Kafka. Each event refers to a device and contains values for every minute of the next 2 to 3 hours. What I would like to do is to predict the minute values for the next 24 hours. So I would
Hi
I have some questions regarding Spark Streaming.
I receive a stream of JSON messages from Kafka.
The messages consist of a timestamp and an ID.
timestamp ID
2016-12-06 13:00 1
2016-12-06 13:40 5
...
In a database I have values for each ID:
ID
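One common pattern for this shape of problem is to enrich each streamed (timestamp, ID) event with the static per-ID values from the database. A minimal sketch of that lookup join, with hypothetical data (in Spark Streaming you would broadcast the table, or join inside `transform`):

```python
# Sketch: enrich streamed (timestamp, id) events with per-ID values
# loaded from a database. The dict stands in for a broadcast lookup
# table; all values here are made up for illustration.

id_values = {1: 100, 5: 250}          # per-ID values from the database

events = [
    ("2016-12-06 13:00", 1),
    ("2016-12-06 13:40", 5),
]

enriched = [(ts, dev_id, id_values.get(dev_id)) for ts, dev_id in events]
print(enriched)
```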
Hi,
I am a newbie in Spark Streaming and have some questions.
1) Is it possible to group a stream in Spark Streaming like in Storm (field grouping)?
2) Could the batch size be used instead of a time window?
Thank you in advance.
Regards,
Daniela
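A sketch of both questions with plain-Python stand-ins. Storm's field grouping corresponds to key-based operations such as `groupByKey`/`reduceByKey` on a DStream (question 1); the batch interval only fixes the smallest unit of processing, so spans longer than one batch still need a window over several micro-batches (question 2). The helper names below are illustrative, not Spark API:

```python
# Sketch: key grouping within one micro-batch, and a window that
# unions the last few micro-batches before grouping.

from collections import defaultdict

def group_batch(batch):
    """Group one micro-batch of (key, value) pairs by key."""
    grouped = defaultdict(list)
    for key, value in batch:
        grouped[key].append(value)
    return dict(grouped)

def windowed(batches, window_len):
    """Union the last `window_len` micro-batches, like window()."""
    recent = [pair for batch in batches[-window_len:] for pair in batch]
    return group_batch(recent)

b1 = [("dev1", 1), ("dev2", 2)]
b2 = [("dev1", 3)]
print(group_batch(b2))        # grouping within a single batch
print(windowed([b1, b2], 2))  # grouping over a 2-batch window
```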
Hi,
I would like to cache values and to use only the latest "valid" values to build
a sum.
In more detail, I receive values from devices periodically. I would like to add
up all the valid values each minute. But not every device sends a new value
every minute. And as long as there is no new
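The "latest valid value per device" requirement maps to Spark Streaming's stateful operations (`updateStateByKey`/`mapWithState`): the state keeps each device's most recent reading, and the per-minute sum is taken over that state, so a device that sends nothing still contributes its last value. A minimal pure-Python sketch of that state update, with made-up device names:

```python
# Sketch of the stateful logic: each micro-batch overwrites a device's
# stored value; the sum each minute runs over the full state, so devices
# that stayed silent still count with their last reading.

def update_state(state, batch):
    """Overwrite each device's stored value with its newest reading."""
    for device, value in batch:
        state[device] = value
    return state

state = {}
minute1 = [("dev1", 5), ("dev2", 7)]
minute2 = [("dev1", 6)]               # dev2 sent nothing this minute

state = update_state(state, minute1)
print(sum(state.values()))            # 12
state = update_state(state, minute2)
print(sum(state.values()))            # 13 -- dev2's old value still counts
```

If "valid" also means "not too old", the state entry would additionally carry a timestamp and stale entries would be dropped before summing.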