>> The frequency is 5 minutes.
>> So if we use Cassandra or Hive, it means Spark will have to read 24 hours of
>> data every 5 minutes. And among those data, a big part (like 23 hours or
>> more) will be repeatedly read.
>>
>> The window in Spark is for stream computing. I did not use
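[For what it's worth, the incremental bookkeeping a windowed approach would do can be sketched in plain Python. This is only an illustration of the idea, not the Spark or Flink API; the class name and sizes are made up.]

```python
from collections import deque

class SlidingWindowSum:
    """Keep a running sum over the last `max_slices` batches.

    Each new batch adds one slice and evicts at most one expired slice,
    so the cost per 5-minute tick is O(new data), not O(24 hours of data).
    """
    def __init__(self, max_slices):
        self.max_slices = max_slices
        self.slices = deque()   # per-batch sums, oldest first
        self.total = 0          # aggregate over the current window

    def add_batch(self, values):
        s = sum(values)
        self.slices.append(s)
        self.total += s
        if len(self.slices) > self.max_slices:
            self.total -= self.slices.popleft()  # drop only the expired slice
        return self.total

# 24 hours of 5-minute batches = 288 slices
w = SlidingWindowSum(max_slices=288)
```

This is exactly what a windowed aggregation saves you: the other 23+ hours are never touched again, only added once and subtracted once.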
> > 1. Store in Cassandra with TTL = 24 hours. When you read the full
> > table, you get the latest 24 hours' data.
> > 2. Store in Hive as an ORC file and use the timestamp field to filter out the
> > old data.
> > 3. Try windowing in Spark or Flink (I have not used either).
> >
> >
>
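[Suggestions 1 and 2 both reduce to the same predicate — keep only rows from the trailing 24 hours. A minimal sketch in plain Python; the field names `ts` and `v` are made up for illustration, and Cassandra/Hive would of course evaluate this server-side rather than in application code.]

```python
from datetime import datetime, timedelta

def last_24_hours(rows, now=None):
    """Keep only rows whose timestamp falls within the trailing
    24-hour window. Cassandra's TTL expiry and a Hive timestamp
    predicate both amount to this filter."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(hours=24)
    return [r for r in rows if r["ts"] >= cutoff]
```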
>Best regards / Mit freundlichen Grüßen / Sincères salutations
>M. Lohith Samaga
>
>
>-----Original Message-----
>From: kramer2...@126.com [mailto:kramer2...@126.com]
>Sent: Monday, April 11, 2016 16.18
>To: user@spark.apache.org
>Subject: Why Spark having OutOfMemory Exception?
>
>I use Spark to do some very simple calculation. The description is as below
>(pseudo code):
>
>
>While timestamp == 5 minutes
>
not have a memory problem.
I am wondering if there is a lineage issue, but I am not sure.
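[If it is lineage, the usual symptom is a dependency chain that grows by one link every batch until memory runs out, and the usual fix is periodic checkpointing. A plain-Python analogy of why that bounds the chain — this is only an illustration, not the Spark API:]

```python
# Each "transformation" keeps a reference to its parent, so an
# iteratively updated dataset grows an unbounded lineage chain
# unless it is periodically materialized and the chain dropped.

class Node:
    def __init__(self, parent=None):
        self.parent = parent

def lineage_depth(node):
    """Walk the parent chain and count its length."""
    depth = 0
    while node is not None:
        depth += 1
        node = node.parent
    return depth

ds = Node()
for i in range(1, 101):
    ds = Node(parent=ds)   # like df = df.union(new_batch): chain grows
    if i % 10 == 0:
        ds = Node()        # like a checkpoint: materialize, drop the chain
```

Without the periodic reset the chain would be 101 links deep after the loop; with it, the depth never exceeds 11.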
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Why-Spark-having-OutOfMemory-Exception-tp26743.html
Sent from the Apache Spark User List mailing list archive