> Author: Big Data Analytics with Spark
> <http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
>
>
>
> *From:* Mich Talebzadeh [mailto:mich.talebza...@gmail.com]
> *Sent:* Saturday, August 6, 2016 12:25 PM
> *To:* Mohammed Guller
> *Cc:* Jacek Laskowski; Saurav Sinha; user
*Subject:* Re: Explanation regarding Spark Streaming
Hi,
I think the default storage level
<http://spark.apache.org/docs/latest/programming-guide.html#rdd-persistence> is
MEMORY_ONLY
HTH
Dr Mich Talebzadeh
LinkedIn
https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcP
[...] performance even worse.
Mohammed
From: Jacek Laskowski [mailto:ja...@japila.pl]
Sent: Saturday, August 6, 2016 1:54 AM
To: Mohammed Guller
Cc: Saurav Sinha; user
Subject: RE: Explanation regarding Spark Streaming
Hi,
Thanks for explanation, but it does not prove Spark will OOM at some point. You [...]ntly at the same rate.
>>
>> Also keep in mind that windowing operations on a DStream implicitly
>> persist every RDD in a DStream in memory.
>>
>> Mohammed
>>
>> -Original Message-
>> From: Jacek Laskowski [mailto:ja...@japila.pl]
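A back-of-the-envelope illustration of the windowing point quoted above (plain Python, no Spark needed; the 5-minute window and 60 s batch interval are assumed figures, not from this thread): a window of length W sliding over batches of length B forces roughly ceil(W/B) per-batch RDDs to stay materialized at any moment, so memory pressure scales with the window length, not just the batch size.

```python
import math

def rdds_held_in_memory(window_s: float, batch_s: float) -> int:
    """Approximate number of per-batch RDDs a windowed DStream keeps cached:
    every batch that falls inside the current window must stay materialized."""
    return math.ceil(window_s / batch_s)

# Assumed example figures: 60 s batches under a 5-minute window.
print(rdds_held_in_memory(window_s=300, batch_s=60))  # prints 5
```

This is only the count of retained RDDs; the actual bytes held depend on the chosen storage level (MEMORY_ONLY by default, per the reply above).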
> Sent: Thursday, August 4, 2016 4:25 PM
> To: Mohammed Guller
> Cc: Saurav Sinha; user
> Subject: Re: Explanation regarding Spark Streaming
>
> On Fri, Aug 5, 2016 at 12:48 AM, Mohammed Guller <moham...@glassbeam.com>
> wrote:
> > and eventually you will run out of memory.
>
> Why? Mind elaborating?
>
> Jacek
>
> [...] will run out of memory.
>
>
>
> Mohammed
>
> Author: Big Data Analytics with Spark
> <http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>
>
>
>
> *From:* Saurav Sinha [mailto:sauravsinh...@gmail.com]
> *Sent:* Wednesday, August 3, 2016 11:57 PM
> *To:* user
> *Subject:* Explanation regarding Spark Streaming
Hi,
I have a query.

Q1. What will happen if a Spark Streaming job has a batchDurationTime of 60 sec
and the processing time of the complete pipeline is greater than 60 sec?
--
Thanks and Regards,
Saurav Sinha
Contact: 9742879062
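To make Q1 concrete: if every 60 s batch consistently takes longer than 60 s to process, batches queue up and the scheduling delay grows without bound. A minimal pure-Python simulation of that queueing arithmetic (no Spark involved; the 75 s per-batch processing time is an assumed figure):

```python
def scheduling_delays(batch_interval_s, processing_time_s, num_batches):
    """Delay between when each batch *arrives* and when its processing *starts*,
    assuming batches are processed one at a time in arrival order."""
    delays = []
    ready_at = 0.0                       # time the pipeline becomes free
    for i in range(num_batches):
        arrival = i * batch_interval_s   # batch i is cut at t = i * interval
        start = max(arrival, ready_at)   # wait for the previous batch to finish
        delays.append(start - arrival)
        ready_at = start + processing_time_s
    return delays

# 60 s batches, 75 s of processing each: the backlog grows by 15 s per batch.
print(scheduling_delays(60, 75, 5))  # prints [0.0, 15.0, 30.0, 45.0, 60.0]
```

The queued, unprocessed batches have to be held somewhere in the meantime, which is the mechanism behind the "eventually you will run out of memory" claim debated above; Spark 1.5+ can throttle the receive rate instead via `spark.streaming.backpressure.enabled`.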