The most common reason is that YARN is killing containers because they are
trying to use more memory than they are allowed. Please try bumping up the
memory limits.
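
A minimal sketch of what "bumping it up" could look like, assuming Spark 1.x
on YARN (the thread's timeframe); the property names are real, but the values
are only illustrative:

    import org.apache.spark.SparkConf

    // Illustrative values; tune to your cluster. YARN kills a container when
    // heap plus off-heap overhead exceeds the requested size, so raise both
    // the executor heap and the overhead YARN accounts for.
    val conf = new SparkConf()
      .set("spark.executor.memory", "4g")                // executor heap
      .set("spark.yarn.executor.memoryOverhead", "1024") // off-heap, in MB (Spark 1.x name)
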
On 16 Dec 2015 06:50, "Eran Witkon" <eranwit...@gmail.com> wrote:

> When running
> val data = sc.wholeTextFiles("someDir/*"); data.count()
>
> I get numerous warnings from YARN until I get an Akka association exception.
> Can someone explain what happens when Spark loads this RDD and can't fit it
> all in memory?
> Based on the exception it looks like the server is disconnecting from YARN
> and failing... Any idea why? The code is simple but still failing...
> Eran
>
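
On the quoted question itself: wholeTextFiles returns one (path, contents)
pair per file, so each whole file must fit in a single task's memory, which
is why large inputs can push an executor past its YARN container limit. A
hedged sketch, not from the thread: if per-line processing is enough,
textFile keeps individual records small:

    val perFile = sc.wholeTextFiles("someDir/*") // RDD[(String, String)]: whole files as values
    val perLine = sc.textFile("someDir/*")       // RDD[String]: one record per line
    perLine.count()
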
