On Nov 27, 2013 12:44 AM, "Liu, Raymond" wrote:
How about memory usage, any GC problems? When you say it gets stuck, do you mean 0%
or 1200% CPU with no progress?
Raymond
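Raymond's GC question can be checked without the web UI. A minimal sketch (plain JVM management APIs, no Spark dependency; names are standard JDK) that prints this JVM's cumulative garbage-collection activity:

```scala
import java.lang.management.ManagementFactory
import scala.jdk.CollectionConverters._

// Cumulative GC stats for this JVM: a collection time that keeps growing
// while the job makes no progress points at a GC problem.
for (gc <- ManagementFactory.getGarbageCollectorMXBeans.asScala)
  println(s"${gc.getName}: collections=${gc.getCollectionCount}, timeMs=${gc.getCollectionTime}")
```

Running this periodically (or attaching jstat) shows whether collection time climbs while the job sits at 0% useful CPU.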
From: Vijay Gaikwad [mailto:vijay...@gmail.com]
Sent: Wednesday, November 27, 2013 2:54 PM
To: user@spark.incubator.apache.org
Subject: Re: local[k] job gets stuck - spark 0.8.0
Hi Patrick,
Sorry, I don't have access to the web UI.
So I have been running these jobs on larger servers and letting them run.
I have observed that when I run a job with "local[12]", it runs for some time
at full throttle at 1200% CPU consumption, but after a while the processing
drops to 0%.
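For context on the 1200% figure: local[k] runs k worker threads inside a single JVM, so a fully busy local[12] job shows roughly 100% CPU per core in top on a 12-core machine. A quick check of how many cores the JVM sees:

```scala
// Runtime.availableProcessors reports the cores visible to this JVM;
// a saturated local[12] job uses 12 of them, i.e. ~1200% CPU in top.
val cores = Runtime.getRuntime.availableProcessors
println(s"available processors: $cores")
```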
After
When it gets stuck, what does it show in the web UI? Also, can you run
a jstack on the process and attach the output... that might explain
what's going on.
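If running the jstack binary is inconvenient, roughly the same information can be captured from inside the JVM. A sketch using only standard APIs (no Spark dependency):

```scala
import scala.jdk.CollectionConverters._

// A programmatic equivalent of a jstack dump: name, state, and stack
// frames for every live thread in this JVM. Threads parked in state
// WAITING/BLOCKED with no progress are the ones to look at.
for ((t, frames) <- Thread.getAllStackTraces.asScala) {
  println(s""""${t.getName}" state=${t.getState}""")
  frames.foreach(f => println(s"    at $f"))
}
```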
On Mon, Nov 25, 2013 at 11:30 AM, Vijay Gaikwad wrote:
I am using Apache Spark 0.8.0 to process a large data file and perform some
basic .map and .reduceByKey operations on the RDD.
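For readers unfamiliar with reduceByKey: it merges all values that share a key with a supplied function. A pure-Scala analogue (illustrative only, no Spark dependency; the helper name is made up here):

```scala
// A pure-Scala stand-in for RDD.reduceByKey: merge all values sharing
// a key with the given function (here, summation, as in a word count).
def reduceByKey[K, V](pairs: Seq[(K, V)])(f: (V, V) => V): Map[K, V] =
  pairs.groupBy(_._1).map { case (k, kvs) => k -> kvs.map(_._2).reduce(f) }

val counts = reduceByKey(Seq(("a", 1), ("b", 1), ("a", 1)))(_ + _)
println(counts)  // counts: a -> 2, b -> 1
```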
Since I am using a single machine with multiple processors, I specify local[8]
in the master URL field while creating the SparkContext:
val sc = new SparkContext("local[8]", "MyApp")  // second argument (app name) is a placeholder