The error in the log file says:

*java.lang.OutOfMemoryError: GC overhead limit exceeded*

for a certain task ID, and the error repeats for further task IDs.

What could be the problem?
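For what it's worth, a minimal sketch of one thing worth checking: whether the executor heap is simply too small for the data set, since "GC overhead limit exceeded" usually means the JVM is spending almost all of its time in garbage collection. This assumes the job builds its own SparkConf; the app name and the "2g" value are placeholders only, not recommendations:

    // Sketch only: raise the executor heap before the SparkContext is created.
    // "MyJob" and "2g" are hypothetical placeholders; tune to the node's RAM.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("MyJob")                  // placeholder app name
      .set("spark.executor.memory", "2g")   // heap available to each executor
    val sc = new SparkContext(conf)

If the job is launched through spark-submit, the same setting can be passed on the command line instead of in code.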

On Sun, Jan 18, 2015 at 2:45 PM, Deep Pradhan <pradhandeep1...@gmail.com>
wrote:

> Does updating the Spark version mean setting up the entire cluster once
> more? Or can we update it in some other way?
>
> On Sat, Jan 17, 2015 at 3:22 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Can you paste the code? Also, you could try updating your Spark version.
>>
>> Thanks
>> Best Regards
>>
>> On Sat, Jan 17, 2015 at 2:40 PM, Deep Pradhan <pradhandeep1...@gmail.com>
>> wrote:
>>
>>> Hi,
>>> I am using Spark 1.0.0 on a single-node cluster. When I run a job with a
>>> small data set it runs perfectly, but when I use a data set of 350 KB, no
>>> output is produced, and when I try to run it a second time it gives me an
>>> exception saying that the SparkContext was shut down.
>>> Can anyone help me on this?
>>>
>>> Thank You
>>>
>>
>>
>
