Re: Why Spark having OutOfMemory Exception?

2016-04-21 Thread Zhan Zhang
…old data. > 3. Try windowing in spark or flink (have not used either). > Best regards / Mit freundlichen Grüßen / Sincères salutations > M. Lohith Samaga > -----Original Message----- > From: kramer2...@126.com …

Re:Re: Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread 李明伟
…3. Try windowing in spark or flink (have not used either). > Best regards / Mit freundlichen Grüßen / Sincères salutations > M. Lohith Samaga > -----Original Message----- > From: kramer2...@126.com > Sent: Monday, April 11, 2016 16.18 > To: user@spark.apache.org …

Re: Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread Jeff Zhang
…The frequency is 5 minutes. >> So if I use Cassandra or Hive, it means Spark will have to read 24 hours of data every 5 minutes, and a big part of that data (like 23 hours or more) will be repeatedly read. >> The window in Spark is for stream computing. I did not use…
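The windowing idea discussed in this thread maps to Spark Streaming's windowed reductions. Below is a minimal sketch, not code from the thread: it assumes the 5-minute batches arrive as "key,value" text files under a made-up path, and uses reduceByKeyAndWindow with an inverse function so each 5-minute slide only adds the new slice and subtracts the expired one instead of re-aggregating the full 24 hours.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, StreamingContext}

object WindowedAgg {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("windowed-agg")
    val ssc = new StreamingContext(conf, Minutes(5))
    // Checkpointing is required when using the inverse-reduce variant below.
    ssc.checkpoint("/tmp/spark-checkpoint")              // illustrative path

    // Hypothetical source: "key,value" text files landing every 5 minutes.
    val lines = ssc.textFileStream("/data/incoming")     // illustrative path
    val pairs = lines.map { l =>
      val Array(k, v) = l.split(",", 2)
      (k, v.toDouble)
    }

    // 24-hour window sliding every 5 minutes. The inverse function (_ - _)
    // lets Spark subtract the slice that fell out of the window instead of
    // re-reading and re-aggregating the full 24 hours each time.
    val windowedSums = pairs.reduceByKeyAndWindow(
      _ + _, _ - _, Minutes(24 * 60), Minutes(5))

    windowedSums.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```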

Re:Re: Re: Why Spark having OutOfMemory Exception?

2016-04-20 Thread 李明伟
ry windowing in spark or flink (have not used either). > > >Best regards / Mit freundlichen Grüßen / Sincères salutations >M. Lohith Samaga > > >-Original Message- >From: kramer2...@126.com [mailto:kramer2...@126.com] >Sent: Monday, April 11, 2016 16.18 >To:

Re: Re: Why Spark having OutOfMemory Exception?

2016-04-19 Thread Jeff Zhang
…1. Store in Cassandra with TTL = 24 hours. When you read the full table, you get the latest 24 hours of data. > > 2. Store in Hive as an ORC file and use a timestamp field to filter out the old data. > > 3. Try windowing in spark or flink (have not used either). …
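For suggestion 2 (Hive/ORC plus a timestamp filter), a rough sketch against the Spark 1.6-era DataFrame API is below. The paths, the event_ts column, and the JSON source are all invented for illustration, and whether old stripes are actually skipped at read time depends on ORC predicate push-down being enabled.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object OrcLast24h {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("orc-last-24h"))
    val sqlContext = new HiveContext(sc)

    // Append each 5-minute batch as ORC; the paths are hypothetical.
    val batch = sqlContext.read.json("/data/incoming/latest")
    batch.write.format("orc").mode("append").save("/warehouse/events_orc")

    // Read back only the last 24 hours by filtering on an event-timestamp
    // column (assumed here to be epoch milliseconds named event_ts).
    val cutoff = System.currentTimeMillis() - 24L * 60 * 60 * 1000
    val last24h = sqlContext.read.format("orc").load("/warehouse/events_orc")
      .filter(s"event_ts >= $cutoff")

    last24h.groupBy("key").count().show()

    sc.stop()
  }
}
```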

Re:Re: Why Spark having OutOfMemory Exception?

2016-04-19 Thread 李明伟
…Mit freundlichen Grüßen / Sincères salutations > M. Lohith Samaga > -----Original Message----- > From: kramer2...@126.com > Sent: Monday, April 11, 2016 16.18 > To: user@spark.apache.org > Subject: Why Spark having OutOfMemory Exception? > I use spark…

Re: Why Spark having OutOfMemory Exception?

2016-04-18 Thread Zhan Zhang
…Sent: Monday, April 11, 2016 16.18 > To: user@spark.apache.org > Subject: Why Spark having OutOfMemory Exception? > I use spark to do some very simple calculation. The description is like below (pseudo code): > While timestamp == 5 minut…
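The original pseudo code is truncated in these previews, but the description reads like a long-running driver loop that folds each 5-minute batch into a cached 24-hour data set and recomputes aggregations over it. A guess at that shape, with all paths and column names invented, might look like this:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}
import org.apache.spark.{SparkConf, SparkContext}

object FiveMinuteLoop {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("five-minute-loop"))
    val sqlContext = new SQLContext(sc)

    // Accumulated "last 24 hours" data; None until the first batch arrives.
    var window24h: Option[DataFrame] = None

    while (true) {
      // Every 5 minutes: read the newest batch (path is hypothetical).
      val newBatch = sqlContext.read.json("/data/incoming/latest")

      // Fold the new batch into the running set and drop rows older than
      // 24 hours (event_ts is an assumed epoch-millisecond column).
      val cutoff = System.currentTimeMillis() - 24L * 60 * 60 * 1000
      val merged = window24h.map(_.unionAll(newBatch)).getOrElse(newBatch)
        .filter(s"event_ts >= $cutoff")
      merged.cache()

      // Simple min/max/avg-style aggregation over the 24-hour set.
      merged.groupBy("key").count().show()

      window24h = Some(merged)
      Thread.sleep(5 * 60 * 1000)
    }
  }
}
```

Note that nothing in a loop like this unpersists the previous iteration's cached data, and the union lineage keeps growing, which is one plausible way such a job could eventually hit OutOfMemory.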

Re:RE: Why Spark having OutOfMemory Exception?

2016-04-18 Thread 李明伟
…From: kramer2...@126.com > Sent: Monday, April 11, 2016 16.18 > To: user@spark.apache.org > Subject: Why Spark having OutOfMemory Exception? > I use spark to do some very simple calculation. The description is like below (pseudo code): > While timestamp == 5 minutes …

RE: Why Spark having OutOfMemory Exception?

2016-04-11 Thread Lohith Samaga M
…(have not used either). Best regards / Mit freundlichen Grüßen / Sincères salutations M. Lohith Samaga -----Original Message----- From: kramer2...@126.com Sent: Monday, April 11, 2016 16.18 To: user@spark.apache.org Subject: Why Spark having OutOfMemory Exception? I use spark to do…
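Suggestion 1 (Cassandra with TTL = 24 hours) could be sketched with the DataStax spark-cassandra-connector roughly as below. This is not code from the thread: the keyspace, table, columns, and host are illustrative, and the table is assumed to already exist in Cassandra with matching columns.

```scala
import com.datastax.spark.connector._
import com.datastax.spark.connector.writer.{TTLOption, WriteConf}
import org.apache.spark.{SparkConf, SparkContext}

object SaveWithTtl {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("save-with-ttl")
      .set("spark.cassandra.connection.host", "127.0.0.1")   // illustrative host
    val sc = new SparkContext(conf)

    // Hypothetical 5-minute batch of (key, value) rows.
    val batch = sc.parallelize(Seq(("sensor-1", 42.0), ("sensor-2", 17.5)))

    // Write with a 24-hour TTL so rows expire on their own; reading the full
    // table then always returns only the most recent 24 hours of data.
    batch.saveToCassandra(
      "metrics", "readings",                                 // illustrative keyspace/table
      SomeColumns("key", "value"),
      writeConf = WriteConf(ttl = TTLOption.constant(24 * 60 * 60)))

    sc.stop()
  }
}
```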

Why Spark having OutOfMemory Exception?

2016-04-11 Thread kramer2...@126.com
…not have a memory problem. I am wondering if there is a lineage issue, but I am not sure. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Why-Spark-having-OutOfMemory-Exception-tp26743.html Sent from the Apache Spark User List mailing list archive…
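On the lineage suspicion in the original post: if a job keeps unioning new batches into a cached RDD in a loop, the dependency chain grows every iteration, and periodic checkpointing plus unpersisting the previous cached copy is the usual way to keep memory bounded. A generic sketch of that technique, not the poster's actual job, with made-up paths and data:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object BoundedLineage {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("bounded-lineage"))
    sc.setCheckpointDir("/tmp/spark-checkpoint")             // illustrative path

    var acc: RDD[(String, Double)] = sc.emptyRDD
    for (i <- 1 to 1000) {
      val batch = sc.parallelize(Seq((s"key-$i", i.toDouble)))  // stand-in for new data
      val previous = acc
      acc = previous.union(batch).cache()

      // Every 10 iterations, cut the lineage so the union chain (and the
      // memory it pins) does not grow without bound.
      if (i % 10 == 0) {
        acc.checkpoint()
        acc.count()            // materialize so the checkpoint actually happens
      }
      previous.unpersist()     // drop the previous iteration's cached copy
    }
    sc.stop()
  }
}
```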