>>>>> On Mon, May 9, 2016 at 4:43 PM, 李明伟 wrote:
>>>>>>
>>>>>> Thanks.
>>>>>> What if I use batch calculation instead of stream computing? Do I
>>>>>> still need that much memory? For example, if the 24 hour data set
>>>>> Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
>>>>>
>>>>> OK, terms for Spark Streaming:
>>>>>
>>>>> "Batch interval" is the basic interval at which the system will
>>>>> receive the data in batches.
>>>>>
>>>>> Then you have these two params:
>>>>>
>>>>> // window length - the duration of the window; it must be a
>>>>> // multiple of the batch interval n in
>>>>> //   val ssc = new StreamingContext(sparkConf, Seconds(n))
>>>>>
>>>>> // sliding interval - the interval at which the window operation is
>>>>> // performed; it must also be a multiple of the batch interval
>>>>> Both parameters must be multiples of the batch interval, as received
>>>>> data is divided into batches of duration "batch interval".
>>>>>
>>>>> If you want to collect 1 hour of data, then windowLength = 12 * 5 * 60
>>>>> seconds.
>>>>> If you want to collect 24 hours of data, then windowLength = 24 * 12 *
>>>>> 5 * 60 seconds.
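The arithmetic above can be sketched in plain Scala (no Spark required). The 5-minute batch interval comes from the question, and `isMultipleOfBatch` mirrors Spark Streaming's requirement that window and slide durations be whole multiples of the batch interval:

```scala
// Window sizing for a 5-minute batch interval (values from the thread).
object WindowSizing {
  val batchIntervalSec: Int = 5 * 60            // 5-minute batch interval

  // 1-hour window: 12 batches of 5 minutes
  val oneHourWindowSec: Int = 12 * 5 * 60       // = 3600

  // 24-hour window: 24 of those 1-hour spans
  val oneDayWindowSec: Int = 24 * 12 * 5 * 60   // = 86400

  // Spark requires window and slide durations to be whole multiples
  // of the batch interval; this check encodes that rule.
  def isMultipleOfBatch(durationSec: Int): Boolean =
    durationSec % batchIntervalSec == 0
}
```

Both candidate windows pass the multiple-of-batch check, so the sizes themselves are legal; whether they are practical is a memory question, as discussed below in the thread.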
>>>>>
>>>>> Mich Talebzadeh
>>>>>
>>>>> LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>>>
>>>>> http://talebzadehmich.wordpress.com
> [...] big enough to fit the stream data for 24 hours. Usually memory is
> the limiting factor for the window size.
>
> Dhiraj Peechara
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-big-the-spark-stream-window-could-be-tp26899p26903.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
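Dhiraj's point that memory limits the window size can be made concrete with a back-of-envelope sketch. The event rate and record size below are hypothetical placeholders, not figures from the thread; substitute measurements from your own stream:

```scala
// Rough sizing of the raw data a time window must retain in memory.
// eventsPerSecond and bytesPerEvent are HYPOTHETICAL example values.
object WindowMemoryEstimate {
  val eventsPerSecond: Long = 1000L   // hypothetical ingest rate
  val bytesPerEvent: Long   = 500L    // hypothetical serialized record size

  // Raw bytes a window of `windowSec` seconds must hold
  def windowBytes(windowSec: Long): Long =
    eventsPerSecond * bytesPerEvent * windowSec

  val oneHourBytes: Long = windowBytes(60L * 60)       // 1,800,000,000 bytes
  val oneDayBytes: Long  = windowBytes(24L * 60 * 60)  // 43,200,000,000 bytes
}
```

Even at this modest hypothetical rate, the 24-hour window holds on the order of 40 GiB of raw data, which is why the cluster's memory, not Spark itself, usually decides how big a window can be.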
> [...] be calculated, and I am considering using Spark
> Streaming to do it.
>
> We need to generate three kinds of reports. The reports are based on:
>
> 1. The last 5 minutes of data
> 2. The last 1 hour of data
> 3. The last 24 hours of data
>
> The frequency of reports is 5 minutes.
>
> After reading the docs, the most obvious way to solve this seems to be to
> set up a Spark stream with a 5-minute interval and two windows of 1 hour
> and 1 day.
>
> But I am worried that windows of one day and one hour may be too big. I do
> not have much experience with Spark Streaming, so what window length do
> you use in your environment?
>
> Any official docs talking about this?
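The three reports in the question map naturally onto (window length, slide interval) pairs. A small sketch (plain Scala; the report names are illustrative) shows that each window is a whole number of 5-minute batches, which is exactly the alignment Spark Streaming requires:

```scala
// Window plan for the three reports; every report fires every 5 minutes.
object ReportWindows {
  val slideSec: Int = 5 * 60   // report frequency from the question

  // (illustrative report name, window length in seconds)
  val reports: Seq[(String, Int)] = Seq(
    ("last-5-min",     5 * 60),
    ("last-1-hour",   60 * 60),
    ("last-24-hours", 24 * 60 * 60)
  )

  // How many 5-minute batches each window spans
  def batchesInWindow(windowSec: Int): Int = windowSec / slideSec

  // Every window must be an exact multiple of the slide/batch interval
  val allAligned: Boolean = reports.forall { case (_, w) => w % slideSec == 0 }
}
```

Since all three windows share the same 5-minute slide, a single stream with one batch interval can drive all three reports; the 24-hour window spans 288 batches, which is where the memory concern raised in the replies comes in.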
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/How-big-the-spark-stream-window-could-be-tp26899.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.