If the problem is containers trying to use more memory than they are allowed,
how do I limit them? I already have executor-memory set to 5G.
Eran
On Tue, 15 Dec 2015 at 23:10 Zhan Zhang <zzh...@hortonworks.com> wrote:

> You should be able to get the logs from YARN with “yarn logs -applicationId
> xxx”, where you can possibly find the cause.
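> For example (the application id below is a placeholder; the real id
> appears in the ResourceManager UI or in spark-submit's console output):
>
>   yarn logs -applicationId application_1450000000000_0001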
>
> Thanks.
>
> Zhan Zhang
>
> On Dec 15, 2015, at 11:50 AM, Eran Witkon <eranwit...@gmail.com> wrote:
>
> > When running
> >
> >   val data = sc.wholeTextFiles("someDir/*")
> >   data.count()
> >
> > I get numerous warnings from YARN until I get an Akka association
> > exception. Can someone explain what happens when Spark loads this RDD
> > and can't fit it all in memory?
> > Based on the exception it looks like the server is disconnecting from
> > YARN and failing... Any idea why? The code is simple but still failing...
> > Eran
>
>
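(For context on the question quoted above: wholeTextFiles materializes each
file as a single (path, contents) record, so any single large file must fit
entirely in one executor's heap, and an executor killed by YARN for exceeding
its container is what typically surfaces on the driver as Akka association
errors. A line-oriented read keeps individual records small; a sketch against
the same directory:)

  // Each record is one line rather than one whole file, so no single
  // file has to fit in memory at once. Note that count() now counts
  // lines, not files.
  val lines = sc.textFile("someDir/*")
  lines.count()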
