-----Original Message-----
From: kramer2...@126.com [mailto:kramer2...@126.com]
Sent: Monday, April 11, 2016 16.18
To: user@spark.apache.org
Subject: Why Spark having OutOfMemory Exception?

I use Spark to do some very simple calculation. The description is like below
(pseudo code):

While timestamp == 5 minutes
    df = read_hdf()                # Read hdfs
    my_dict[timestamp] = df        # Put the data frame into a dict
    delete_old_dataframe(my_dict)  # Delete old data frames (timestamp is over 24 hours before)
    big_df = merge(my_dict)        # Merge the recent 24 hours

I want to know if anything is wrong with this model, because it becomes very
slow a while after starting and hits OutOfMemory. I know that my memory is
enough, and the size of each file is very small for test purposes, so there
should not be a memory problem.

I am wondering if there is a lineage issue, but I am not sure.
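The loop above can be sketched as a minimal plain-Python stand-in (`read_hdf` and the data are mocked here; the real code presumably builds Spark DataFrames):

```python
from datetime import datetime, timedelta

WINDOW = timedelta(hours=24)

def delete_old_dataframe(my_dict, now):
    # Drop entries whose timestamp is more than 24 hours before `now`.
    for ts in [ts for ts in my_dict if now - ts > WINDOW]:
        del my_dict[ts]

def merge(my_dict):
    # Stand-in for the DataFrame merge: concatenate all batches in order.
    return [row for ts in sorted(my_dict) for row in my_dict[ts]]

my_dict = {}
start = datetime(2016, 4, 11)
# Simulate 26 hours of 5-minute batches, one row per batch.
for i in range(26 * 12):
    now = start + timedelta(minutes=5 * i)
    my_dict[now] = [("rows read at", now)]   # read_hdf() stand-in
    delete_old_dataframe(my_dict, now)
    big_df = merge(my_dict)

# Only the most recent 24 hours of batches (289 timestamps, inclusive) remain.
assert len(my_dict) == 289 and len(big_df) == 289
```

The dict itself stays bounded, which is the point: in Spark, dropping a key frees the driver-side reference, but any cached data and lineage those frames contribute to `big_df` is only released once nothing still points at it.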
Regards,
Mingwei

At 2016-04-11 19:09:48, "Lohith Samaga M" wrote:
>Hi Kramer,
>    Some options:
>    1. Store in Cassandra with TTL = 24 hours. When you read the full
>    table, you get the latest 24 hours' data.
>
>Best regards / Mit freundlichen Grüßen / Sincères salutations
>M. Lohith Samaga
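Option 1 above works because Cassandra expires rows server-side. The behavior being relied on can be mimicked with a toy store (an illustration of the TTL semantics only, not the Cassandra API):

```python
import time

class TTLStore:
    """Toy key-value store that expires entries after `ttl` seconds,
    loosely mimicking Cassandra's per-row TTL."""
    def __init__(self, ttl):
        self.ttl = ttl
        self._rows = {}   # key -> (inserted_at, value)

    def insert(self, key, value, now=None):
        self._rows[key] = (now if now is not None else time.time(), value)

    def read_full_table(self, now=None):
        # Reading the "full table" only ever returns live (unexpired) rows.
        now = now if now is not None else time.time()
        return {k: v for k, (t, v) in self._rows.items()
                if now - t < self.ttl}

store = TTLStore(ttl=24 * 3600)
store.insert("batch-0", "old rows", now=0)             # inserted at t=0
store.insert("batch-1", "recent rows", now=23 * 3600)  # inserted at t=23h
live = store.read_full_table(now=25 * 3600)            # read at t=25h
assert live == {"batch-1": "recent rows"}              # batch-0 expired
```

With this design the application never runs `delete_old_dataframe` at all; expiry happens in the store, and each 5-minute cycle just inserts the new batch and reads the window back.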
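On the closing "lineage issue" guess: if each iteration's merged result is built on top of the previous one (rather than from scratch over the raw batches), the logical plan deepens every 5 minutes, which matches the "slow, then OutOfMemory" symptom. A toy model of plans as trees, with hypothetical names; in real Spark the equivalent fix is checkpointing (e.g. `RDD.checkpoint()`, or writing the merged result out and re-reading it):

```python
class Plan:
    """Toy stand-in for a lazy DataFrame's logical plan."""
    def __init__(self, op, children=()):
        self.op, self.children = op, tuple(children)

    def depth(self):
        return 1 + max((c.depth() for c in self.children), default=0)

def checkpoint(plan):
    # Materialize: replace the whole tree with a fresh leaf,
    # the way Spark's checkpoint truncates lineage.
    return Plan("scan(checkpointed)")

# Growing pattern: each iteration unions the new batch onto the
# previous result, so the plan deepens on every 5-minute cycle.
big = Plan("scan")
for _ in range(288):
    big = Plan("union", [big, Plan("scan")])
assert big.depth() == 289

# Same loop with a periodic checkpoint keeps the plan shallow.
big = Plan("scan")
for i in range(288):
    big = Plan("union", [big, Plan("scan")])
    if i % 10 == 9:
        big = checkpoint(big)
assert big.depth() <= 11
```

Whether this is actually what the posted code does depends on details the pseudocode omits; if `merge` always rebuilds from the raw 288 batches, the plan is wide but not deep, and the leak would have to be elsewhere (e.g. frames cached but never unpersisted).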
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Why-Spark-having-OutOfMemory-Exception-tp26743.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.